Results for Open-Source LLMs
The table below captures the benchmark performance of the open-source models. Most of them were run through Ollama.ai on a consumer-grade laptop.
Please note that these models vary in their "open-source-ness" (what has actually been released) and in their licensing terms (what they can be used for). Be careful: some of them are licensed for research purposes only (e.g., Microsoft Phi).
Reminder: The scores below are on a scale of 0-100, where 100 is the best possible score and 0 means the generated code was not even parseable.
```julia
# Imports
using JuliaLLMLeaderboard
using CairoMakie, AlgebraOfGraphics
using MarkdownTables, DataFramesMeta
using Statistics: mean, median, quantile;
```
```julia
# ! Configuration
SAVE_PLOTS = false
DIR_RESULTS = joinpath(pkgdir(JuliaLLMLeaderboard), "code_generation")
PAID_MODELS_DEFAULT = [
    "gpt-3.5-turbo",
    "gpt-3.5-turbo-1106",
    "gpt-4-1106-preview",
    "mistral-tiny",
    "mistral-small",
    "mistral-medium",
];
# TODO: add mapping for model sizes to color them
MODEL_SIZES = Dict()
PROMPTS = [
    "JuliaExpertCoTTask",
    "JuliaExpertAsk",
    "InJulia",
    "JuliaRecapTask",
    "JuliaRecapCoTTask",
];
```

Load Results
We use only the 5 most recent evaluations available for each definition/model/prompt combination.
```julia
df = @chain begin
    load_evals(DIR_RESULTS; max_history = 5)
    @rsubset !any(startswith.(:model, PAID_MODELS_DEFAULT)) && :prompt_label in PROMPTS
end
```

| Row | device | name | model | prompt_label | prompt_strategy | parsed | executed | unit_tests_count | timestamp | unit_tests_passed | experiment | cost | elapsed_seconds | examples_executed | tokens | version_pt | examples_count | version_prompt | parameters | schema | filename | score | temperature | options | top_p |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | String | String | String | String | String | Bool | Bool | Int64 | String | Int64 | String | Float64 | Float64 | Int64 | Array… | String | Int64 | String | Object… | String | String | Float64 | Float64? | Object…? | Float64? |
| 1 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | InJulia | 1SHOT | true | false | 4 | 20231213_230806__193 | 0 | 0.0 | 8.65479 | 0 | [82, 255] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231213_230806__193.json | 25.0 | missing | missing | missing | |
| 2 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | InJulia | 1SHOT | true | true | 4 | 20231224_215713__453 | 0 | 0.0 | 16.5864 | 0 | [90, 299] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231224_215713__453.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | InJulia | 1SHOT | true | true | 4 | 20231224_215726__570 | 0 | 0.0 | 12.339 | 0 | [90, 220] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231224_215726__570.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | InJulia | 1SHOT | true | true | 4 | 20231226_205306__662 | 0 | 0.0 | 16.6737 | 0 | [90, 302] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231226_205306__662.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231213_230757__380 | 0 | 0.0 | 7.88754 | 0 | [112, 222] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231213_230757__380.json | 50.0 | missing | missing | missing | |
| 6 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_215643__493 | 1 | 0.0 | 9.52042 | 3 | [129, 157] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231224_215643__493.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 7 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_215657__736 | 0 | 0.0 | 12.8782 | 0 | [129, 220] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231224_215657__736.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 8 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231226_205249__838 | 1 | 0.0 | 7.94984 | 1 | [129, 128] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231226_205249__838.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 9 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231213_230749__984 | 0 | 0.0 | 22.7754 | 0 | [239, 589] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231213_230749__984.json | 50.0 | missing | missing | missing | |
| 10 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231224_215629__136 | 0 | 0.0 | 15.2446 | 0 | [257, 52] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231224_215629__136.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 11 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231224_215634__384 | 0 | 0.0 | 4.59877 | 0 | [257, 43] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231224_215634__384.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 12 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231226_205241__762 | 0 | 0.0 | 21.5078 | 0 | [257, 181] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231226_205241__762.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 13 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231213_230900__251 | 0 | 0.0 | 17.932 | 0 | [11, 487] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231213_230900__251.json | 50.0 | missing | missing | missing | |
| 14 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_215850__983 | 1 | 0.0 | 20.522 | 1 | [394, 311] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231224_215850__983.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 15 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231224_215919__856 | 0 | 0.0 | 28.4554 | 0 | [394, 449] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231224_215919__856.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 16 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231226_205426__535 | 0 | 0.0 | 23.2939 | 0 | [394, 361] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231226_205426__535.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 17 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaRecapTask | 1SHOT | false | false | 4 | 20231213_230842__931 | 0 | 0.0 | 22.6544 | 0 | [383, 525] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231213_230842__931.json | 0.0 | missing | missing | missing | |
| 18 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_215809__525 | 0 | 0.0 | 18.1071 | 0 | [391, 268] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231224_215809__525.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 19 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_215830__702 | 0 | 0.0 | 20.8176 | 0 | [391, 316] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231224_215830__702.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 20 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 4 | 20231226_205402__392 | 4 | 0.0 | 55.6299 | 4 | [391, 904] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231226_205402__392.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 21 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | InJulia | 1SHOT | true | false | 4 | 20231213_230946__496 | 0 | 0.0 | 17.0295 | 0 | [82, 499] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__InJulia__1SHOT__20231213_230946__496.json | 25.0 | missing | missing | missing | |
| 22 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | InJulia | 1SHOT | false | false | 4 | 20231224_220016__286 | 0 | 0.0 | 12.569 | 0 | [64, 230] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__InJulia__1SHOT__20231224_220016__286.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 23 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | InJulia | 1SHOT | false | false | 4 | 20231224_220019__990 | 0 | 0.0 | 3.50014 | 0 | [64, 57] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__InJulia__1SHOT__20231224_220019__990.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 24 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231213_230929__136 | 0 | 0.0 | 7.67115 | 0 | [112, 215] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231213_230929__136.json | 25.0 | missing | missing | missing | |
| 25 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231224_215950__869 | 0 | 0.0 | 9.74012 | 0 | [66, 172] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231224_215950__869.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 26 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231224_220003__975 | 0 | 0.0 | 13.2863 | 0 | [66, 239] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231224_220003__975.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 27 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231213_230921__834 | 0 | 0.0 | 20.6148 | 0 | [239, 530] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231213_230921__834.json | 50.0 | missing | missing | missing | |
| 28 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231224_215937__260 | 0 | 0.0 | 18.0286 | 0 | [132, 126] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231224_215937__260.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 29 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231224_215940__811 | 0 | 0.0 | 3.16863 | 0 | [132, 36] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231224_215940__811.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 30 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231213_231040__282 | 0 | 0.0 | 17.9712 | 0 | [11, 488] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231213_231040__282.json | 50.0 | missing | missing | missing | |
| 31 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231224_220130__602 | 0 | 0.0 | 1.38221 | 0 | [83, 11] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231224_220130__602.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 32 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231224_220132__819 | 0 | 0.0 | 2.63389 | 0 | [83, 35] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231224_220132__819.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 33 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 4 | 20231213_231022__289 | 0 | 0.0 | 22.288 | 0 | [383, 515] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231213_231022__289.json | 0.0 | missing | missing | missing | |
| 34 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 4 | 20231224_220118__216 | 0 | 0.0 | 3.19781 | 0 | [80, 46] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231224_220118__216.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 35 | Apple-MacBook-Pro-M1 | add_yearmonth | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 4 | 20231224_220128__849 | 0 | 0.0 | 10.8613 | 0 | [80, 193] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231224_220128__849.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 36 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 4 | 20231219_205930__896 | 0 | 0.0 | 12.9329 | 0 | [1, 399] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_205930__896.json | 25.0 | missing | missing | missing | |
| 37 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 4 | 20231219_205938__989 | 0 | 0.0 | 8.49001 | 0 | [1, 260] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_205938__989.json | 25.0 | missing | missing | missing | |
| 38 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231224_222301__332 | 0 | 0.0 | 50.0536 | 0 | [83, 287] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231224_222301__332.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 39 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231224_222342__552 | 0 | 0.0 | 39.2743 | 0 | [83, 229] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231224_222342__552.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 40 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231226_210403__146 | 0 | 0.0 | 32.3636 | 0 | [83, 191] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231226_210403__146.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 41 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231219_205855__890 | 0 | 0.0 | 7.41282 | 0 | [1, 233] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_205855__890.json | 25.0 | missing | missing | missing | |
| 42 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231219_205901__340 | 0 | 0.0 | 5.4559 | 0 | [1, 172] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_205901__340.json | 0.0 | missing | missing | missing | |
| 43 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231224_222043__593 | 0 | 0.0 | 37.0856 | 0 | [124, 193] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_222043__593.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 44 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231224_222211__903 | 0 | 0.0 | 88.1791 | 0 | [124, 482] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_222211__903.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 45 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231226_210331__537 | 0 | 0.0 | 46.3106 | 0 | [124, 272] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_210331__537.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 46 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231219_205826__211 | 0 | 0.0 | 14.1975 | 0 | [1, 401] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_205826__211.json | 25.0 | missing | missing | missing | |
| 47 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231219_205837__773 | 0 | 0.0 | 11.4811 | 0 | [1, 340] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_205837__773.json | 25.0 | missing | missing | missing | |
| 48 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_221918__361 | 4 | 0.0 | 80.9647 | 4 | [252, 222] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_221918__361.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 49 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_222005__409 | 4 | 0.0 | 46.583 | 4 | [252, 217] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_222005__409.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 50 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231226_210244__148 | 4 | 0.0 | 77.3058 | 4 | [252, 283] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_210244__148.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 51 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231219_210157__128 | 0 | 0.0 | 20.067 | 0 | [1, 549] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_210157__128.json | 25.0 | missing | missing | missing | |
| 52 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231219_210215__790 | 0 | 0.0 | 18.1918 | 0 | [1, 501] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_210215__790.json | 0.0 | missing | missing | missing | |
| 53 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_222805__507 | 4 | 0.0 | 80.8156 | 4 | [412, 425] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_222805__507.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 54 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_222824__706 | 1 | 0.0 | 19.5209 | 3 | [412, 58] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_222824__706.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 55 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231226_210644__755 | 4 | 0.0 | 56.8096 | 4 | [412, 285] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_210644__755.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 56 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 4 | 20231219_210059__809 | 0 | 0.0 | 17.7074 | 0 | [1, 489] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_210059__809.json | 25.0 | missing | missing | missing | |
| 57 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 4 | 20231219_210122__439 | 0 | 0.0 | 22.2271 | 0 | [1, 603] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_210122__439.json | 25.0 | missing | missing | missing | |
| 58 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_222548__552 | 0 | 0.0 | 32.3384 | 0 | [410, 135] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_222548__552.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 59 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_222643__363 | 0 | 0.0 | 54.1194 | 0 | [410, 268] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_222643__363.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 60 | Apple-MacBook-Pro-M1 | add_yearmonth | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231226_210547__589 | 0 | 0.0 | 103.052 | 0 | [410, 559] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_210547__589.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 61 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 4 | 20231226_211452__883 | 0 | 0.0 | 5.42072 | 0 | [85, 207] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231226_211452__883.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 62 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 4 | 20231227_095541__986 | 0 | 0.0 | 13.561 | 0 | [85, 516] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_095541__986.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 63 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 4 | 20231227_095551__780 | 0 | 0.0 | 9.78193 | 0 | [85, 375] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_095551__780.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 64 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | true | 4 | 20231227_095559__848 | 0 | 0.0 | 8.36018 | 0 | [85, 321] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_095559__848.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 65 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231226_211446__834 | 0 | 0.0 | 1.89911 | 0 | [122, 63] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_211446__834.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 66 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_095522__559 | 0 | 0.0 | 7.51261 | 0 | [122, 283] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_095522__559.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 67 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_095526__194 | 0 | 0.0 | 4.2142 | 0 | [122, 155] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_095526__194.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 68 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231227_095527__797 | 0 | 0.0 | 1.5631 | 0 | [122, 49] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_095527__797.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 69 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231226_211444__679 | 0 | 0.0 | 10.4501 | 0 | [234, 246] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_211444__679.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 70 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231227_095458__906 | 0 | 0.0 | 9.44909 | 0 | [234, 212] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_095458__906.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 71 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_095508__391 | 0 | 0.0 | 10.0284 | 0 | [234, 356] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_095508__391.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 72 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231227_095514__173 | 0 | 0.0 | 6.26306 | 0 | [234, 215] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_095514__173.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 73 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231226_211518__804 | 0 | 0.0 | 13.9077 | 0 | [374, 465] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_211518__804.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 74 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231227_095631__152 | 0 | 0.0 | 10.6941 | 0 | [374, 352] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_095631__152.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 75 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_095636__478 | 0 | 0.0 | 4.92165 | 0 | [374, 141] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_095636__478.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 76 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_095646__355 | 0 | 0.0 | 10.7356 | 0 | [374, 353] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_095646__355.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 77 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 4 | 20231226_211504__648 | 0 | 0.0 | 12.2395 | 0 | [371, 407] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_211504__648.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 78 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 4 | 20231227_095606__405 | 0 | 0.0 | 6.97045 | 0 | [371, 217] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_095606__405.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 79 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_095613__194 | 0 | 0.0 | 6.77164 | 0 | [371, 210] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_095613__194.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 80 | Apple-MacBook-Pro-M1 | add_yearmonth | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | false | 4 | 20231227_095620__390 | 0 | 0.0 | 6.74354 | 0 | [371, 209] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_095620__390.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 81 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | InJulia | 1SHOT | true | false | 4 | 20231224_214015__496 | 0 | 0.0 | 7.84819 | 0 | [82, 232] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__InJulia__1SHOT__20231224_214015__496.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 82 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | InJulia | 1SHOT | true | false | 4 | 20231224_214035__183 | 0 | 0.0 | 19.7087 | 0 | [1, 589] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__InJulia__1SHOT__20231224_214035__183.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 83 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | InJulia | 1SHOT | true | false | 4 | 20231226_203505__279 | 0 | 0.0 | 17.0546 | 0 | [82, 502] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__InJulia__1SHOT__20231226_203505__279.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 84 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | InJulia | 1SHOT | true | false | 4 | 20231226_203516__281 | 0 | 0.0 | 10.5221 | 0 | [1, 324] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__InJulia__1SHOT__20231226_203516__281.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 85 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | InJulia | 1SHOT | true | false | 4 | 20231226_204455__223 | 0 | 0.0 | 32.6698 | 0 | [82, 909] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__InJulia__1SHOT__20231226_204455__223.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 86 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_213957__497 | 0 | 0.0 | 9.37755 | 0 | [112, 267] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaExpertAsk__1SHOT__20231224_213957__497.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 87 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231224_214007__226 | 0 | 0.0 | 8.22632 | 0 | [1, 253] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaExpertAsk__1SHOT__20231224_214007__226.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 88 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231226_203439__916 | 0 | 0.0 | 7.46196 | 0 | [112, 209] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaExpertAsk__1SHOT__20231226_203439__916.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 89 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231226_203448__728 | 0 | 0.0 | 9.34418 | 0 | [1, 289] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaExpertAsk__1SHOT__20231226_203448__728.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 90 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231226_204422__111 | 0 | 0.0 | 6.61563 | 0 | [112, 181] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaExpertAsk__1SHOT__20231226_204422__111.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 91 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231224_213929__413 | 0 | 0.0 | 22.6526 | 0 | [257, 348] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231224_213929__413.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 92 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231224_213947__378 | 0 | 0.0 | 17.3161 | 0 | [1, 493] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231224_213947__378.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 93 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231226_203420__902 | 0 | 0.0 | 20.0184 | 0 | [257, 366] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_203420__902.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 94 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231226_203431__638 | 0 | 0.0 | 11.5576 | 0 | [1, 339] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_203431__638.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 95 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231226_204414__832 | 0 | 0.0 | 23.9462 | 0 | [257, 395] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_204414__832.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 96 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231213_230308__132 | 0 | 0.0 | 14.0784 | 0 | [11, 382] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231213_230308__132.json | 50.0 | missing | missing | missing | |
| 97 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231224_214151__944 | 0 | 0.0 | 16.1789 | 0 | [11, 442] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231224_214151__944.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 98 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_214220__279 | 0 | 0.0 | 29.3745 | 0 | [1, 774] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231224_214220__279.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 99 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231226_203647__528 | 0 | 0.0 | 19.107 | 0 | [11, 519] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_203647__528.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 100 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231226_204536__900 | 0 | 0.0 | 18.173 | 0 | [11, 497] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_204536__900.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 101 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaRecapTask | 1SHOT | true | false | 4 | 20231224_214121__105 | 0 | 0.0 | 17.319 | 0 | [383, 387] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaRecapTask__1SHOT__20231224_214121__105.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 102 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_214135__489 | 0 | 0.0 | 13.7519 | 0 | [1, 385] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaRecapTask__1SHOT__20231224_214135__489.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 103 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaRecapTask | 1SHOT | true | true | 4 | 20231226_203603__890 | 0 | 0.0 | 23.746 | 0 | [383, 554] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaRecapTask__1SHOT__20231226_203603__890.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 104 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaRecapTask | 1SHOT | false | false | 4 | 20231226_203627__976 | 0 | 0.0 | 24.1853 | 0 | [1, 651] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaRecapTask__1SHOT__20231226_203627__976.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 105 | Apple-MacBook-Pro-M1 | add_yearmonth | llama2 | JuliaRecapTask | 1SHOT | false | false | 4 | 20231226_204517__690 | 0 | 0.0 | 22.2887 | 0 | [383, 513] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/llama2/evaluation__JuliaRecapTask__1SHOT__20231226_204517__690.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 106 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | InJulia | 1SHOT | false | false | 4 | 20231213_231126__920 | 0 | 0.0 | 14.8593 | 0 | [82, 437] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__InJulia__1SHOT__20231213_231126__920.json | 0.0 | missing | missing | missing | |
| 107 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | InJulia | 1SHOT | true | true | 4 | 20231224_220223__782 | 0 | 0.0 | 11.0284 | 0 | [82, 360] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__InJulia__1SHOT__20231224_220223__782.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 108 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | InJulia | 1SHOT | true | true | 4 | 20231224_220229__218 | 0 | 0.0 | 5.76961 | 0 | [82, 183] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__InJulia__1SHOT__20231224_220229__218.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 109 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | InJulia | 1SHOT | true | true | 4 | 20231226_205457__290 | 0 | 0.0 | 6.93698 | 0 | [82, 224] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__InJulia__1SHOT__20231226_205457__290.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 110 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231213_231111__168 | 0 | 0.0 | 9.17501 | 0 | [112, 260] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231213_231111__168.json | 50.0 | missing | missing | missing | |
| 111 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_220207__671 | 0 | 0.0 | 7.37878 | 0 | [122, 232] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231224_220207__671.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 112 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_220212__569 | 0 | 0.0 | 4.6755 | 0 | [122, 141] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231224_220212__569.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 113 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231226_205450__204 | 0 | 0.0 | 8.69468 | 0 | [122, 277] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231226_205450__204.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 114 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231213_231102__246 | 0 | 0.0 | 21.7643 | 0 | [239, 563] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231213_231102__246.json | 25.0 | missing | missing | missing | |
| 115 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231224_220149__428 | 0 | 0.0 | 16.5262 | 0 | [249, 313] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231224_220149__428.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 116 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_220159__582 | 0 | 0.0 | 10.3947 | 0 | [249, 308] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231224_220159__582.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 117 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231226_205441__853 | 0 | 0.0 | 15.2213 | 0 | [249, 277] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231226_205441__853.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 118 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231213_231216__154 | 0 | 0.0 | 20.0306 | 0 | [11, 540] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231213_231216__154.json | 50.0 | missing | missing | missing | |
| 119 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_220311__471 | 0 | 0.0 | 11.0945 | 0 | [386, 302] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231224_220311__471.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 120 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_220324__277 | 0 | 0.0 | 12.1812 | 0 | [386, 336] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231224_220324__277.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 121 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231226_205513__194 | 0 | 0.0 | 10.079 | 0 | [386, 271] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231226_205513__194.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 122 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaRecapTask | 1SHOT | true | true | 4 | 20231213_231156__788 | 0 | 0.0 | 18.1615 | 0 | [383, 410] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaRecapTask__1SHOT__20231213_231156__788.json | 50.0 | missing | missing | missing | |
| 123 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaRecapTask | 1SHOT | true | false | 4 | 20231224_220251__379 | 0 | 0.0 | 8.74037 | 0 | [383, 232] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaRecapTask__1SHOT__20231224_220251__379.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 124 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_220300__977 | 0 | 0.0 | 9.06196 | 0 | [383, 242] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaRecapTask__1SHOT__20231224_220300__977.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 125 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder | JuliaRecapTask | 1SHOT | true | true | 4 | 20231226_205503__695 | 0 | 0.0 | 5.89635 | 0 | [383, 141] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder/evaluation__JuliaRecapTask__1SHOT__20231226_205503__695.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 126 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 4 | 20231227_174325__527 | 0 | 0.0 | 11.9271 | 0 | [82, 223] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_174325__527.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 127 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 4 | 20231227_174339__424 | 0 | 0.0 | 14.3941 | 0 | [82, 276] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_174339__424.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 128 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 4 | 20231227_174354__564 | 0 | 0.0 | 14.4403 | 0 | [82, 279] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_174354__564.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 129 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_174254__941 | 0 | 0.0 | 10.6599 | 0 | [122, 199] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_174254__941.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 130 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_174303__589 | 0 | 0.0 | 8.87612 | 0 | [122, 163] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_174303__589.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 131 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_174313__328 | 0 | 0.0 | 9.56752 | 0 | [122, 173] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_174313__328.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 132 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_174205__934 | 0 | 0.0 | 21.9707 | 0 | [249, 262] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_174205__934.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 133 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231227_174226__547 | 0 | 0.0 | 20.7435 | 0 | [249, 376] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_174226__547.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 134 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231227_174243__107 | 0 | 0.0 | 17.5298 | 0 | [249, 318] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_174243__107.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 135 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_174442__730 | 0 | 0.0 | 14.3644 | 0 | [386, 237] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_174442__730.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 136 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_174454__806 | 0 | 0.0 | 12.2136 | 0 | [386, 196] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_174454__806.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 137 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_174510__987 | 0 | 0.0 | 16.2236 | 0 | [386, 252] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_174510__987.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 138 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_174406__571 | 0 | 0.0 | 12.2934 | 0 | [383, 200] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_174406__571.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 139 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_174416__969 | 0 | 0.0 | 9.19656 | 0 | [383, 139] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_174416__969.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 140 | Apple-MacBook-Pro-M1 | add_yearmonth | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_174427__514 | 0 | 0.0 | 11.4643 | 0 | [383, 180] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_174427__514.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 141 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 4 | 20231219_210835__100 | 0 | 0.0 | 17.0624 | 0 | [1, 518] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_210835__100.json | 25.0 | missing | missing | missing | |
| 142 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 4 | 20231219_210845__989 | 0 | 0.0 | 9.95985 | 0 | [1, 313] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_210845__989.json | 25.0 | missing | missing | missing | |
| 143 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 4 | 20231224_223313__643 | 0 | 0.0 | 4.2227 | 0 | [79, 98] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231224_223313__643.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 144 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231224_223315__988 | 0 | 0.0 | 2.13813 | 0 | [79, 43] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231224_223315__988.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 145 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231226_210906__347 | 0 | 0.0 | 5.06874 | 0 | [79, 120] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231226_210906__347.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 146 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231219_210754__275 | 0 | 0.0 | 8.08097 | 0 | [1, 253] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_210754__275.json | 25.0 | missing | missing | missing | |
| 147 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231219_210803__582 | 0 | 0.0 | 8.46745 | 0 | [1, 265] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_210803__582.json | 25.0 | missing | missing | missing | |
| 148 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231224_223303__394 | 0 | 0.0 | 3.4628 | 0 | [121, 73] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_223303__394.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 149 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_223308__843 | 0 | 0.0 | 5.14864 | 0 | [121, 117] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_223308__843.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 150 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231226_210901__269 | 0 | 0.0 | 2.93179 | 0 | [121, 59] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_210901__269.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 151 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231219_210716__682 | 0 | 0.0 | 12.2902 | 0 | [1, 362] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_210716__682.json | 25.0 | missing | missing | missing | |
| 152 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231219_210737__395 | 0 | 0.0 | 20.7773 | 0 | [1, 589] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_210737__395.json | 25.0 | missing | missing | missing | |
| 153 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231224_223243__927 | 0 | 0.0 | 22.7312 | 0 | [248, 403] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_223243__927.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 154 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_223300__344 | 0 | 0.0 | 17.1518 | 0 | [248, 403] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_223300__344.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 155 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231226_210858__502 | 0 | 0.0 | 23.0942 | 0 | [248, 417] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_210858__502.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 156 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231219_211104__526 | 0 | 0.0 | 22.2904 | 0 | [1, 604] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_211104__526.json | 25.0 | missing | missing | missing | |
| 157 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231219_211126__448 | 0 | 0.0 | 22.0555 | 0 | [1, 598] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_211126__448.json | 25.0 | missing | missing | missing | |
| 158 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_223403__299 | 0 | 0.0 | 10.0877 | 0 | [388, 202] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_223403__299.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 159 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_223422__815 | 0 | 0.0 | 19.8235 | 0 | [388, 442] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_223422__815.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 160 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231226_210941__289 | 0 | 0.0 | 19.1932 | 0 | [388, 426] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_210941__289.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 161 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 4 | 20231219_211003__573 | 0 | 0.0 | 20.9196 | 0 | [1, 570] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_211003__573.json | 25.0 | missing | missing | missing | |
| 162 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 4 | 20231219_211021__698 | 0 | 0.0 | 17.9365 | 0 | [1, 495] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_211021__698.json | 25.0 | missing | missing | missing | |
| 163 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 4 | 20231224_223342__907 | 0 | 0.0 | 12.7121 | 0 | [386, 267] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_223342__907.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 164 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_223352__876 | 0 | 0.0 | 10.7043 | 0 | [386, 217] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_223352__876.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 165 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231226_210922__804 | 0 | 0.0 | 15.3031 | 0 | [386, 331] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_210922__804.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 166 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 4 | 20231227_222026__914 | 0 | 0.0 | 16.07 | 0 | [78, 472] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_222026__914.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 167 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | false | 4 | 20231227_222038__663 | 0 | 0.0 | 11.5974 | 0 | [78, 337] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_222038__663.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 168 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 4 | 20231227_222054__868 | 0 | 0.0 | 15.346 | 0 | [78, 450] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_222054__868.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 169 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 4 | 20231227_222111__322 | 1 | 0.0 | 17.1671 | 4 | [78, 506] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_222111__322.json | 81.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 170 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | false | 4 | 20231227_222126__896 | 0 | 0.0 | 14.996 | 0 | [78, 444] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_222126__896.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 171 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231227_221944__542 | 0 | 0.0 | 5.80932 | 0 | [120, 170] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_221944__542.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 172 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231227_221950__302 | 0 | 0.0 | 5.80134 | 0 | [120, 162] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_221950__302.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 173 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_221957__826 | 0 | 0.0 | 6.68358 | 0 | [120, 190] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_221957__826.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 174 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231227_222002__843 | 0 | 0.0 | 5.21487 | 0 | [120, 147] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_222002__843.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 175 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231227_222010__184 | 0 | 0.0 | 8.17628 | 0 | [120, 229] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_222010__184.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 176 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231227_221828__501 | 0 | 0.0 | 15.8588 | 0 | [247, 282] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_221828__501.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 177 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_221851__970 | 0 | 0.0 | 22.7724 | 0 | [247, 671] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_221851__970.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 178 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_221906__854 | 0 | 0.0 | 14.1624 | 0 | [247, 411] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_221906__854.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 179 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231227_221917__230 | 0 | 0.0 | 10.9461 | 0 | [247, 310] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_221917__230.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 180 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_221938__421 | 1 | 0.0 | 21.5188 | 4 | [247, 628] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_221938__421.json | 81.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 181 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_222348__467 | 1 | 0.0 | 32.3177 | 1 | [387, 887] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_222348__467.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 182 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_222400__440 | 0 | 0.0 | 11.3898 | 0 | [387, 298] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_222400__440.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 183 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231227_222415__651 | 0 | 0.0 | 14.8602 | 0 | [387, 396] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_222415__651.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 184 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231227_222427__364 | 0 | 0.0 | 12.4823 | 0 | [387, 320] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_222427__364.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 185 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231227_222440__895 | 0 | 0.0 | 12.3122 | 0 | [387, 316] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_222440__895.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 186 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_222152__482 | 0 | 0.0 | 25.6324 | 0 | [385, 687] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_222152__482.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 187 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 4 | 20231227_222217__230 | 0 | 0.0 | 25.1869 | 0 | [385, 676] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_222217__230.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 188 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_222240__251 | 0 | 0.0 | 22.8827 | 0 | [385, 616] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_222240__251.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 189 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 4 | 20231227_222259__444 | 0 | 0.0 | 18.7928 | 0 | [385, 502] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_222259__444.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 190 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_222315__866 | 0 | 0.0 | 16.2574 | 0 | [385, 422] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_222315__866.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 191 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | false | false | 4 | 20231227_222725__923 | 0 | 0.0 | 24.5615 | 0 | [78, 606] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_222725__923.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 192 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231227_222742__187 | 0 | 0.0 | 16.9818 | 0 | [78, 425] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_222742__187.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 193 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231227_222753__392 | 0 | 0.0 | 10.863 | 0 | [78, 269] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_222753__392.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 194 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231227_222812__308 | 0 | 0.0 | 18.6186 | 0 | [78, 466] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_222812__308.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 195 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | false | 4 | 20231227_222823__362 | 0 | 0.0 | 10.7015 | 0 | [78, 265] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_222823__362.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 196 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_222627__306 | 0 | 0.0 | 8.62802 | 0 | [120, 204] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_222627__306.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 197 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_222633__642 | 1 | 0.0 | 5.97485 | 1 | [120, 136] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_222633__642.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 198 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_222642__973 | 0 | 0.0 | 7.44011 | 0 | [120, 171] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_222642__973.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 199 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_222649__359 | 0 | 0.0 | 7.05934 | 0 | [120, 164] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_222649__359.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 200 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_222700__372 | 0 | 0.0 | 11.288 | 0 | [120, 272] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_222700__372.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 201 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_222514__189 | 0 | 0.0 | 34.5322 | 0 | [247, 689] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_222514__189.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 202 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_222531__741 | 0 | 0.0 | 16.9664 | 0 | [247, 392] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_222531__741.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 203 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_222554__403 | 0 | 0.0 | 22.0129 | 0 | [247, 515] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_222554__403.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 204 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231227_222605__249 | 0 | 0.0 | 11.5167 | 0 | [247, 257] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_222605__249.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 205 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_222618__692 | 0 | 0.0 | 12.4848 | 0 | [247, 281] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_222618__692.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 206 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231227_223106__197 | 0 | 0.0 | 24.4211 | 0 | [387, 550] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_223106__197.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 207 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_223116__630 | 0 | 0.0 | 10.0863 | 0 | [387, 200] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_223116__630.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 208 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_223148__194 | 0 | 0.0 | 31.7965 | 0 | [387, 724] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_223148__194.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 209 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_223213__318 | 0 | 0.0 | 24.8027 | 0 | [387, 559] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_223213__318.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 210 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_223239__952 | 0 | 0.0 | 25.5192 | 0 | [387, 576] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_223239__952.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 211 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 4 | 20231227_222852__562 | 0 | 0.0 | 29.1068 | 0 | [385, 661] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_222852__562.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 212 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_222918__108 | 0 | 0.0 | 26.2944 | 0 | [385, 595] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_222918__108.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 213 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_222945__681 | 1 | 0.0 | 26.8597 | 2 | [385, 608] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_222945__681.json | 68.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 214 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_223017__128 | 0 | 0.0 | 31.4353 | 1 | [385, 716] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_223017__128.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 215 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 4 | 20231227_223042__330 | 0 | 0.0 | 24.3946 | 0 | [385, 549] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_223042__330.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 216 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | false | 4 | 20231226_115349__339 | 0 | 0.0 | 32.2254 | 0 | [78, 584] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_115349__339.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 217 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 4 | 20231226_115411__684 | 0 | 0.0 | 21.9377 | 0 | [78, 401] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_115411__684.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 218 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | false | 4 | 20231226_211310__750 | 0 | 0.0 | 24.3564 | 0 | [78, 448] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_211310__750.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 219 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231226_115305__779 | 0 | 0.0 | 4.33453 | 0 | [120, 67] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_115305__779.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 220 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231226_115316__941 | 0 | 0.0 | 11.313 | 0 | [120, 196] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_115316__941.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 221 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231226_211246__297 | 0 | 0.0 | 17.5896 | 0 | [120, 318] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_211246__297.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 222 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231226_115225__918 | 0 | 0.0 | 34.5511 | 0 | [247, 392] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_115225__918.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 223 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231226_115300__231 | 0 | 0.0 | 32.3957 | 3 | [247, 557] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_115300__231.json | 68.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 224 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231226_211228__845 | 0 | 0.0 | 28.7122 | 0 | [247, 320] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_211228__845.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 225 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231226_115621__498 | 1 | 0.0 | 23.6615 | 3 | [387, 390] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_115621__498.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 226 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231226_115655__171 | 0 | 0.0 | 33.4395 | 0 | [387, 561] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_115655__171.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 227 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231226_211434__485 | 0 | 0.0 | 47.8741 | 0 | [387, 821] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_211434__485.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 228 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 4 | 20231226_115521__407 | 0 | 0.0 | 21.3724 | 0 | [385, 349] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_115521__407.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 229 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 4 | 20231226_115557__452 | 0 | 0.0 | 34.8083 | 0 | [385, 589] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_115557__452.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 230 | Apple-MacBook-Pro-M1 | add_yearmonth | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 4 | 20231226_211346__544 | 0 | 0.0 | 35.343 | 0 | [385, 602] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_211346__544.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 231 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | false | 4 | 20231227_100331__531 | 0 | 0.0 | 43.6627 | 0 | [83, 237] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_100331__531.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 232 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231227_100413__256 | 1 | 0.0 | 42.1394 | 3 | [83, 246] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_100413__256.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 233 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231227_100451__992 | 0 | 0.0 | 37.2126 | 0 | [83, 216] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_100451__992.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 234 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | false | false | 4 | 20231227_140857__638 | 0 | 0.0 | 37.0819 | 0 | [83, 214] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_140857__638.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 235 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231227_140955__313 | 0 | 0.0 | 57.9009 | 0 | [83, 340] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_140955__313.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 236 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231227_100136__353 | 0 | 0.0 | 36.6525 | 0 | [122, 200] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_100136__353.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 237 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_100204__483 | 1 | 0.0 | 28.5557 | 1 | [122, 149] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_100204__483.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 238 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_100246__675 | 1 | 0.0 | 41.6302 | 1 | [122, 224] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_100246__675.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 239 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231227_140807__987 | 0 | 0.0 | 32.9359 | 0 | [122, 176] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_140807__987.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 240 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231227_140820__166 | 0 | 0.0 | 11.7228 | 0 | [122, 53] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_140820__166.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 241 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_095754__625 | 1 | 0.0 | 67.743 | 1 | [248, 343] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_095754__625.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 242 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_095949__337 | 1 | 0.0 | 114.657 | 4 | [248, 645] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_095949__337.json | 81.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 243 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_100059__994 | 1 | 0.0 | 68.9864 | 1 | [248, 369] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_100059__994.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 244 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231227_140627__899 | 0 | 0.0 | 93.4955 | 0 | [248, 505] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_140627__899.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 245 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231227_140734__156 | 0 | 0.0 | 66.3465 | 0 | [248, 346] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_140734__156.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 246 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_100858__537 | 0 | 0.0 | 59.9875 | 0 | [396, 296] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_100858__537.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 247 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_101043__252 | 0 | 0.0 | 105.007 | 0 | [396, 557] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_101043__252.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 248 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_101207__404 | 0 | 0.0 | 83.6938 | 3 | [396, 434] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_101207__404.json | 68.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 249 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231227_141410__970 | 0 | 0.0 | 79.4301 | 0 | [396, 407] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_141410__970.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 250 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231227_141523__762 | 0 | 0.0 | 73.4373 | 0 | [396, 372] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_141523__762.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 251 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_100620__397 | 0 | 0.0 | 89.4829 | 0 | [394, 468] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_100620__397.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 252 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_100644__464 | 0 | 0.0 | 23.6161 | 1 | [394, 80] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_100644__464.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 253 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_100757__261 | 1 | 0.0 | 73.2466 | 3 | [394, 374] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_100757__261.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 254 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231227_141057__871 | 0 | 0.0 | 61.9135 | 0 | [394, 305] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_141057__871.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 255 | Apple-MacBook-Pro-M1 | add_yearmonth | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 4 | 20231227_141250__755 | 0 | 0.0 | 112.798 | 0 | [394, 598] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_141250__755.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 256 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | false | false | 4 | 20231219_211255__302 | 0 | 0.0 | 11.8809 | 0 | [1, 370] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_211255__302.json | 0.0 | missing | missing | missing | |
| 257 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | false | 4 | 20231219_211306__730 | 0 | 0.0 | 11.3203 | 0 | [1, 353] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_211306__730.json | 25.0 | missing | missing | missing | |
| 258 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | false | 4 | 20231224_223527__363 | 0 | 0.0 | 12.7237 | 0 | [87, 318] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231224_223527__363.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 259 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231224_223535__963 | 0 | 0.0 | 7.69095 | 0 | [87, 188] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231224_223535__963.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 260 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231226_211022__802 | 0 | 0.0 | 15.8072 | 0 | [87, 395] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231226_211022__802.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 261 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231219_211221__787 | 0 | 0.0 | 6.76287 | 0 | [1, 213] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_211221__787.json | 25.0 | missing | missing | missing | |
| 262 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231219_211228__869 | 0 | 0.0 | 7.37654 | 0 | [1, 232] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_211228__869.json | 25.0 | missing | missing | missing | |
| 263 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_223507__957 | 0 | 0.0 | 2.61661 | 0 | [129, 47] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_223507__957.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 264 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231224_223515__975 | 0 | 0.0 | 7.6359 | 0 | [129, 178] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_223515__975.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 265 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231226_211006__345 | 0 | 0.0 | 2.62921 | 0 | [129, 47] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_211006__345.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 266 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231219_211156__109 | 0 | 0.0 | 11.5419 | 0 | [1, 341] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_211156__109.json | 25.0 | missing | missing | missing | |
| 267 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231219_211205__741 | 0 | 0.0 | 9.71335 | 0 | [1, 289] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_211205__741.json | 0.0 | missing | missing | missing | |
| 268 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_223445__779 | 1 | 0.0 | 22.1232 | 4 | [256, 368] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_223445__779.json | 81.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 269 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_223504__136 | 0 | 0.0 | 19.2982 | 0 | [256, 455] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_223504__136.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 270 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231226_211003__425 | 0 | 0.0 | 22.0489 | 0 | [256, 375] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_211003__425.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 271 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231219_211525__380 | 0 | 0.0 | 27.843 | 0 | [1, 739] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_211525__380.json | 25.0 | missing | missing | missing | |
| 272 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231219_211541__280 | 0 | 0.0 | 16.5645 | 0 | [1, 459] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_211541__280.json | 25.0 | missing | missing | missing | |
| 273 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_223642__146 | 0 | 0.0 | 18.2173 | 0 | [396, 402] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_223642__146.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 274 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231224_223657__825 | 0 | 0.0 | 14.948 | 0 | [396, 321] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_223657__825.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 275 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231226_211105__230 | 1 | 0.0 | 20.2328 | 1 | [396, 449] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_211105__230.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 276 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 4 | 20231219_211416__442 | 0 | 0.0 | 14.5984 | 0 | [1, 408] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_211416__442.json | 25.0 | missing | missing | missing | |
| 277 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 4 | 20231219_211438__774 | 0 | 0.0 | 22.1919 | 0 | [1, 602] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_211438__774.json | 25.0 | missing | missing | missing | |
| 278 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_223611__144 | 0 | 0.0 | 16.9902 | 0 | [394, 372] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_223611__144.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 279 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_223624__880 | 0 | 0.0 | 12.7834 | 0 | [394, 268] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_223624__880.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 280 | Apple-MacBook-Pro-M1 | add_yearmonth | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231226_211044__118 | 0 | 0.0 | 22.7122 | 0 | [394, 509] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_211044__118.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 281 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 4 | 20231213_230354__160 | 0 | 0.0 | 17.6194 | 0 | [82, 516] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231213_230354__160.json | 25.0 | missing | missing | missing | |
| 282 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 4 | 20231224_214319__409 | 0 | 0.0 | 6.57793 | 0 | [85, 207] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231224_214319__409.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 283 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 4 | 20231224_214323__877 | 0 | 0.0 | 4.52794 | 0 | [85, 138] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231224_214323__877.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 284 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 4 | 20231226_204611__837 | 0 | 0.0 | 7.6741 | 0 | [85, 242] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231226_204611__837.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 285 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231213_230336__999 | 0 | 0.0 | 7.70955 | 0 | [112, 217] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231213_230336__999.json | 25.0 | missing | missing | missing | |
| 286 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_214309__839 | 0 | 0.0 | 2.98036 | 0 | [127, 80] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231224_214309__839.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 287 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_214312__147 | 0 | 0.0 | 3.011 | 0 | [127, 82] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231224_214312__147.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 288 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231226_204604__802 | 0 | 0.0 | 6.27106 | 0 | [127, 190] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231226_204604__802.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 289 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231213_230329__973 | 0 | 0.0 | 20.1513 | 0 | [239, 520] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231213_230329__973.json | 50.0 | missing | missing | missing | |
| 290 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_214246__900 | 0 | 0.0 | 25.225 | 0 | [254, 599] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231224_214246__900.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 291 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_214306__896 | 1 | 0.0 | 19.6735 | 1 | [254, 591] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231224_214306__896.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 292 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231226_204557__171 | 1 | 0.0 | 21.1393 | 1 | [254, 467] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231226_204557__171.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 293 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231213_230444__378 | 0 | 0.0 | 15.1027 | 0 | [11, 410] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231213_230444__378.json | 50.0 | missing | missing | missing | |
| 294 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_214422__875 | 1 | 0.0 | 13.5487 | 1 | [394, 373] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231224_214422__875.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 295 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_214441__872 | 1 | 0.0 | 18.7365 | 1 | [394, 531] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231224_214441__872.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 296 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231226_204649__756 | 0 | 0.0 | 22.1708 | 0 | [394, 632] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231226_204649__756.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 297 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 4 | 20231213_230429__150 | 0 | 0.0 | 19.049 | 0 | [383, 429] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231213_230429__150.json | 50.0 | missing | missing | missing | |
| 298 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | false | false | 4 | 20231224_214402__954 | 0 | 0.0 | 11.0613 | 0 | [392, 295] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231224_214402__954.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 299 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_214408__963 | 0 | 0.0 | 6.54506 | 0 | [392, 152] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231224_214408__963.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 300 | Apple-MacBook-Pro-M1 | add_yearmonth | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | false | 4 | 20231226_204626__845 | 0 | 0.0 | 14.8077 | 0 | [392, 410] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231226_204626__845.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 301 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | InJulia | 1SHOT | true | false | 4 | 20231213_231431__567 | 0 | 0.0 | 12.9547 | 0 | [82, 383] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__InJulia__1SHOT__20231213_231431__567.json | 25.0 | missing | missing | missing | |
| 302 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | InJulia | 1SHOT | true | true | 4 | 20231224_220602__831 | 0 | 0.0 | 7.32448 | 0 | [85, 125] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__InJulia__1SHOT__20231224_220602__831.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 303 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | InJulia | 1SHOT | false | false | 4 | 20231224_220605__366 | 0 | 0.0 | 2.8498 | 0 | [85, 39] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__InJulia__1SHOT__20231224_220605__366.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 304 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | InJulia | 1SHOT | true | true | 4 | 20231226_205626__600 | 0 | 0.0 | 9.0947 | 0 | [85, 159] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__InJulia__1SHOT__20231226_205626__600.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 305 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231213_231418__396 | 0 | 0.0 | 6.75085 | 0 | [112, 188] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231213_231418__396.json | 25.0 | missing | missing | missing | |
| 306 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231224_220534__596 | 0 | 0.0 | 17.8413 | 0 | [125, 316] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231224_220534__596.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 307 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231224_220554__427 | 0 | 0.0 | 20.4519 | 0 | [125, 363] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231224_220554__427.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 308 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231226_205616__701 | 0 | 0.0 | 3.43588 | 0 | [125, 46] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231226_205616__701.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 309 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231213_231412__836 | 0 | 0.0 | 14.2497 | 0 | [239, 361] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231213_231412__836.json | 50.0 | missing | missing | missing | |
| 310 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231224_220511__304 | 0 | 0.0 | 20.7275 | 0 | [252, 167] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231224_220511__304.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 311 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231224_220516__617 | 0 | 0.0 | 5.41993 | 0 | [252, 63] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231224_220516__617.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 312 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231226_205613__710 | 0 | 0.0 | 26.0848 | 0 | [252, 258] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231226_205613__710.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 313 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231213_231529__668 | 0 | 0.0 | 19.5313 | 0 | [11, 528] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231213_231529__668.json | 25.0 | missing | missing | missing | |
| 314 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231224_220741__118 | 0 | 0.0 | 17.4291 | 0 | [389, 255] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231224_220741__118.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 315 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231224_220846__645 | 0 | 0.0 | 64.9873 | 0 | [389, 1045] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231224_220846__645.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 316 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231226_205714__445 | 0 | 0.0 | 15.0533 | 0 | [389, 214] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231226_205714__445.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 317 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaRecapTask | 1SHOT | true | false | 4 | 20231213_231509__775 | 0 | 0.0 | 26.2222 | 0 | [383, 614] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231213_231509__775.json | 25.0 | missing | missing | missing | |
| 318 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 4 | 20231224_220658__519 | 0 | 0.0 | 23.1072 | 0 | [386, 355] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231224_220658__519.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 319 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 4 | 20231224_220724__632 | 0 | 0.0 | 25.2552 | 0 | [386, 392] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231224_220724__632.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 320 | Apple-MacBook-Pro-M1 | add_yearmonth | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 4 | 20231226_205659__326 | 0 | 0.0 | 33.3249 | 0 | [386, 533] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231226_205659__326.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 321 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | true | false | 4 | 20231219_211727__307 | 0 | 0.0 | 9.12611 | 0 | [1, 288] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_211727__307.json | 25.0 | missing | missing | missing | |
| 322 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | true | false | 4 | 20231219_211747__489 | 0 | 0.0 | 20.0043 | 0 | [1, 599] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_211747__489.json | 25.0 | missing | missing | missing | |
| 323 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | true | true | 4 | 20231224_223744__581 | 0 | 0.0 | 4.1468 | 0 | [79, 157] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231224_223744__581.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 324 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 4 | 20231224_223809__375 | 0 | 0.0 | 24.6085 | 0 | [79, 909] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231224_223809__375.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 325 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 4 | 20231226_211146__407 | 0 | 0.0 | 12.8638 | 0 | [79, 491] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231226_211146__407.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 326 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231219_211651__207 | 0 | 0.0 | 10.9584 | 0 | [1, 338] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_211651__207.json | 25.0 | missing | missing | missing | |
| 327 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231219_211657__785 | 0 | 0.0 | 5.71202 | 0 | [1, 181] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_211657__785.json | 25.0 | missing | missing | missing | |
| 328 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231224_223736__319 | 0 | 0.0 | 30.8922 | 0 | [116, 1108] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231224_223736__319.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 329 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231224_223740__779 | 0 | 0.0 | 4.63229 | 0 | [116, 172] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231224_223740__779.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 330 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231226_211133__320 | 0 | 0.0 | 20.2041 | 0 | [116, 747] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_211133__320.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 331 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231219_211612__279 | 0 | 0.0 | 11.5734 | 0 | [1, 342] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_211612__279.json | 25.0 | missing | missing | missing | |
| 332 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231219_211633__586 | 0 | 0.0 | 20.5712 | 0 | [1, 584] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_211633__586.json | 25.0 | missing | missing | missing | |
| 333 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231224_223702__671 | 0 | 0.0 | 5.11684 | 0 | [228, 28] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231224_223702__671.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 334 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231224_223705__847 | 0 | 0.0 | 2.44581 | 0 | [228, 68] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231224_223705__847.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 335 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231226_211113__501 | 0 | 0.0 | 7.95007 | 0 | [228, 147] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_211113__501.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 336 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231219_211943__286 | 0 | 0.0 | 15.8221 | 0 | [1, 440] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_211943__286.json | 25.0 | missing | missing | missing | |
| 337 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231219_212000__721 | 0 | 0.0 | 16.7187 | 0 | [1, 463] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_212000__721.json | 25.0 | missing | missing | missing | |
| 338 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231224_223854__425 | 0 | 0.0 | 6.44599 | 0 | [368, 199] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231224_223854__425.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 339 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231224_223919__845 | 0 | 0.0 | 24.8188 | 0 | [368, 837] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231224_223919__845.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 340 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231226_211159__406 | 0 | 0.0 | 12.0725 | 0 | [368, 402] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_211159__406.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 341 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | false | 4 | 20231219_211855__981 | 0 | 0.0 | 20.4616 | 0 | [1, 559] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_211855__981.json | 25.0 | missing | missing | missing | |
| 342 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | false | 4 | 20231219_211911__495 | 0 | 0.0 | 15.5511 | 0 | [1, 433] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_211911__495.json | 25.0 | missing | missing | missing | |
| 343 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 4 | 20231224_223824__986 | 0 | 0.0 | 6.9273 | 0 | [365, 212] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231224_223824__986.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 344 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 4 | 20231224_223847__876 | 0 | 0.0 | 23.2036 | 0 | [365, 783] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231224_223847__876.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 345 | Apple-MacBook-Pro-M1 | add_yearmonth | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 4 | 20231226_211147__233 | 0 | 0.0 | 1.23476 | 0 | [365, 1] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_211147__233.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 346 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | InJulia | 1SHOT | true | false | 4 | 20231213_231609__695 | 0 | 0.0 | 15.638 | 0 | [82, 460] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231213_231609__695.json | 25.0 | missing | missing | missing | |
| 347 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 4 | 20231224_221158__254 | 0 | 0.0 | 38.6556 | 0 | [93, 297] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231224_221158__254.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 348 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | InJulia | 1SHOT | true | false | 4 | 20231224_221307__198 | 0 | 0.0 | 68.6903 | 0 | [93, 531] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231224_221307__198.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 349 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | InJulia | 1SHOT | false | false | 4 | 20231226_205944__535 | 0 | 0.0 | 41.8397 | 0 | [93, 319] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231226_205944__535.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 350 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231213_231553__913 | 0 | 0.0 | 7.78618 | 0 | [112, 219] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231213_231553__913.json | 50.0 | missing | missing | missing | |
| 351 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_221033__344 | 1 | 0.0 | 12.0045 | 1 | [133, 72] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231224_221033__344.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 352 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_221119__394 | 1 | 0.0 | 45.5381 | 3 | [133, 341] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231224_221119__394.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 353 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231226_205901__525 | 1 | 0.0 | 39.9638 | 1 | [133, 297] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231226_205901__525.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 354 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231213_231545__744 | 0 | 0.0 | 16.2576 | 0 | [239, 416] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231213_231545__744.json | 25.0 | missing | missing | missing | |
| 355 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_220950__612 | 1 | 0.0 | 63.9363 | 4 | [260, 282] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231224_220950__612.json | 81.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 356 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_221021__792 | 1 | 0.0 | 30.5228 | 4 | [260, 198] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231224_221021__792.json | 81.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 357 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231226_205821__855 | 0 | 0.0 | 66.9719 | 0 | [260, 322] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_205821__855.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 358 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231213_231708__422 | 0 | 0.0 | 21.9283 | 0 | [11, 587] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231213_231708__422.json | 0.0 | missing | missing | missing | |
| 359 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231224_221730__174 | 0 | 0.0 | 47.3913 | 0 | [397, 305] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231224_221730__174.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 360 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_221757__868 | 1 | 0.0 | 26.3479 | 3 | [397, 141] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231224_221757__868.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 361 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231226_210127__779 | 1 | 0.0 | 46.201 | 3 | [397, 296] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_210127__779.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 362 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | false | 4 | 20231213_231646__454 | 0 | 0.0 | 22.4059 | 0 | [383, 518] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231213_231646__454.json | 25.0 | missing | missing | missing | |
| 363 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_221538__112 | 0 | 0.0 | 71.8218 | 0 | [394, 492] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231224_221538__112.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 364 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_221642__393 | 0 | 0.0 | 63.9471 | 0 | [394, 432] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231224_221642__393.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 365 | Apple-MacBook-Pro-M1 | add_yearmonth | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | false | 4 | 20231226_210041__395 | 0 | 0.0 | 56.9533 | 0 | [394, 379] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231226_210041__395.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 366 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 4 | 20231219_210350__149 | 0 | 0.0 | 11.3669 | 0 | [1, 355] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_210350__149.json | 25.0 | missing | missing | missing | |
| 367 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 4 | 20231219_210401__529 | 0 | 0.0 | 10.5193 | 0 | [1, 330] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_210401__529.json | 25.0 | missing | missing | missing | |
| 368 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 4 | 20231224_223004__851 | 0 | 0.0 | 16.6118 | 0 | [87, 279] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231224_223004__851.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 369 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231224_223015__980 | 1 | 0.0 | 10.2473 | 1 | [87, 168] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231224_223015__980.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 370 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 4 | 20231226_210750__459 | 1 | 0.0 | 15.9249 | 1 | [87, 267] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231226_210750__459.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 371 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231219_210317__743 | 0 | 0.0 | 5.79113 | 0 | [1, 184] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_210317__743.json | 25.0 | missing | missing | missing | |
| 372 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231219_210326__706 | 0 | 0.0 | 8.97485 | 0 | [1, 280] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_210326__706.json | 25.0 | missing | missing | missing | |
| 373 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_222931__462 | 0 | 0.0 | 12.466 | 0 | [129, 198] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_222931__462.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 374 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231224_222948__793 | 0 | 0.0 | 17.0255 | 0 | [129, 277] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_222948__793.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 375 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231226_210734__789 | 0 | 0.0 | 12.1807 | 0 | [129, 193] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_210734__789.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 376 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231219_210253__101 | 0 | 0.0 | 16.9536 | 0 | [1, 489] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_210253__101.json | 25.0 | missing | missing | missing | |
| 377 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231219_210305__725 | 0 | 0.0 | 11.3949 | 0 | [1, 337] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_210305__725.json | 0.0 | missing | missing | missing | |
| 378 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231224_222859__310 | 0 | 0.0 | 34.2145 | 0 | [256, 388] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_222859__310.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 379 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_222918__565 | 0 | 0.0 | 18.6582 | 0 | [256, 288] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_222918__565.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 380 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231226_210721__583 | 0 | 0.0 | 36.5823 | 0 | [256, 444] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_210721__583.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 381 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231219_210626__363 | 0 | 0.0 | 22.3206 | 0 | [1, 605] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_210626__363.json | 25.0 | missing | missing | missing | |
| 382 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231219_210645__520 | 0 | 0.0 | 19.6091 | 0 | [1, 537] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_210645__520.json | 25.0 | missing | missing | missing | |
| 383 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_223201__488 | 0 | 0.0 | 19.0267 | 0 | [396, 269] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_223201__488.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 384 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_223220__772 | 0 | 0.0 | 18.4085 | 0 | [396, 259] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_223220__772.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 385 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231226_210835__621 | 0 | 0.0 | 18.8388 | 0 | [396, 266] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_210835__621.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 386 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 4 | 20231219_210527__289 | 0 | 0.0 | 23.5755 | 0 | [1, 636] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_210527__289.json | 0.0 | missing | missing | missing | |
| 387 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 4 | 20231219_210539__226 | 0 | 0.0 | 11.8552 | 0 | [1, 335] | 0.5.0-DEV | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_210539__226.json | 0.0 | missing | missing | missing | |
| 388 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 4 | 20231224_223115__169 | 0 | 0.0 | 26.9897 | 0 | [394, 401] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_223115__169.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 389 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 4 | 20231224_223142__961 | 0 | 0.0 | 27.266 | 0 | [394, 406] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_223142__961.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 390 | Apple-MacBook-Pro-M1 | add_yearmonth | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 4 | 20231226_210816__340 | 0 | 0.0 | 26.5701 | 0 | [394, 394] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_210816__340.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 391 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | InJulia | 1SHOT | true | false | 4 | 20231213_231254__188 | 0 | 0.0 | 10.1145 | 0 | [82, 299] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__InJulia__1SHOT__20231213_231254__188.json | 25.0 | missing | missing | missing | |
| 392 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | InJulia | 1SHOT | true | true | 4 | 20231224_220404__640 | 0 | 0.0 | 5.89868 | 0 | [88, 333] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__InJulia__1SHOT__20231224_220404__640.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 393 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | InJulia | 1SHOT | false | false | 4 | 20231224_220412__706 | 0 | 0.0 | 7.52091 | 0 | [88, 422] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__InJulia__1SHOT__20231224_220412__706.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 394 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | InJulia | 1SHOT | true | true | 4 | 20231226_205530__757 | 0 | 0.0 | 5.22782 | 0 | [88, 295] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__InJulia__1SHOT__20231226_205530__757.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 395 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231213_231244__318 | 0 | 0.0 | 7.07503 | 0 | [112, 197] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231213_231244__318.json | 50.0 | missing | missing | missing | |
| 396 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_220353__403 | 0 | 0.0 | 4.95009 | 0 | [125, 273] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231224_220353__403.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 397 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 4 | 20231224_220359__789 | 0 | 0.0 | 5.2296 | 0 | [125, 288] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231224_220359__789.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 398 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231226_205524__721 | 0 | 0.0 | 1.82697 | 0 | [125, 92] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231226_205524__721.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 399 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231213_231237__298 | 0 | 0.0 | 20.17 | 0 | [239, 522] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231213_231237__298.json | 50.0 | missing | missing | missing | |
| 400 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_220339__304 | 0 | 0.0 | 15.5642 | 0 | [232, 646] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231224_220339__304.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 401 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231224_220348__593 | 0 | 0.0 | 8.72532 | 0 | [232, 445] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231224_220348__593.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 402 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231226_205523__760 | 0 | 0.0 | 9.48193 | 0 | [232, 348] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231226_205523__760.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 403 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231213_231357__110 | 0 | 0.0 | 19.6635 | 0 | [11, 531] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231213_231357__110.json | 0.0 | missing | missing | missing | |
| 404 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231224_220440__339 | 0 | 0.0 | 5.62358 | 0 | [375, 247] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231224_220440__339.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 405 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231224_220450__615 | 0 | 0.0 | 9.6912 | 0 | [375, 454] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231224_220450__615.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 406 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231226_205547__151 | 0 | 0.0 | 10.674 | 0 | [375, 502] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231226_205547__151.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 407 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 4 | 20231213_231337__268 | 0 | 0.0 | 24.209 | 0 | [383, 563] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231213_231337__268.json | 50.0 | missing | missing | missing | |
| 408 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 4 | 20231224_220429__737 | 0 | 0.0 | 6.85357 | 0 | [373, 311] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231224_220429__737.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 409 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 4 | 20231224_220435__549 | 0 | 0.0 | 5.6292 | 0 | [373, 248] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231224_220435__549.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 410 | Apple-MacBook-Pro-M1 | add_yearmonth | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 4 | 20231226_205536__902 | 0 | 0.0 | 6.42983 | 0 | [373, 289] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231226_205536__902.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 411 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | InJulia | 1SHOT | true | false | 4 | 20231213_230516__550 | 0 | 0.0 | 10.7484 | 0 | [82, 319] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__InJulia__1SHOT__20231213_230516__550.json | 25.0 | missing | missing | missing | |
| 412 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | InJulia | 1SHOT | true | true | 4 | 20231224_214559__800 | 0 | 0.0 | 7.97435 | 0 | [87, 251] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__InJulia__1SHOT__20231224_214559__800.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 413 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | InJulia | 1SHOT | true | true | 4 | 20231224_214604__293 | 0 | 0.0 | 5.46209 | 0 | [87, 169] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__InJulia__1SHOT__20231224_214604__293.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 414 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | InJulia | 1SHOT | true | true | 4 | 20231226_204732__983 | 0 | 0.0 | 13.1953 | 0 | [87, 421] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__InJulia__1SHOT__20231226_204732__983.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 415 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | false | 4 | 20231213_230506__910 | 0 | 0.0 | 6.41789 | 0 | [112, 178] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231213_230506__910.json | 25.0 | missing | missing | missing | |
| 416 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_214544__136 | 0 | 0.0 | 15.4212 | 0 | [129, 483] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231224_214544__136.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 417 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_214551__136 | 0 | 0.0 | 6.49753 | 0 | [129, 193] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231224_214551__136.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 418 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231226_204718__917 | 0 | 0.0 | 8.15732 | 0 | [129, 248] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231226_204718__917.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 419 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231213_230459__319 | 0 | 0.0 | 15.1655 | 0 | [239, 385] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231213_230459__319.json | 0.0 | missing | missing | missing | |
| 420 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_214512__971 | 0 | 0.0 | 31.1972 | 1 | [256, 768] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231224_214512__971.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 421 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231224_214528__433 | 0 | 0.0 | 15.5994 | 0 | [256, 466] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231224_214528__433.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 422 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 4 | 20231226_204710__179 | 0 | 0.0 | 21.5997 | 0 | [256, 486] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231226_204710__179.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 423 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | false | false | 4 | 20231213_230604__469 | 0 | 0.0 | 11.7843 | 0 | [11, 327] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231213_230604__469.json | 0.0 | missing | missing | missing | |
| 424 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_214704__595 | 0 | 0.0 | 15.7235 | 0 | [396, 439] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231224_214704__595.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 425 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_214712__559 | 0 | 0.0 | 7.48527 | 1 | [396, 182] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231224_214712__559.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 426 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231226_204803__262 | 1 | 0.0 | 19.9672 | 1 | [396, 566] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231226_204803__262.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 427 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 4 | 20231213_230552__351 | 0 | 0.0 | 22.6418 | 0 | [383, 526] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231213_230552__351.json | 50.0 | missing | missing | missing | |
| 428 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_214634__687 | 0 | 0.0 | 13.5355 | 0 | [394, 372] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231224_214634__687.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 429 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_214648__650 | 0 | 0.0 | 14.2957 | 0 | [394, 395] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231224_214648__650.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 430 | Apple-MacBook-Pro-M1 | add_yearmonth | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 4 | 20231226_204743__236 | 0 | 0.0 | 10.3734 | 0 | [394, 272] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231226_204743__236.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 431 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | InJulia | 1SHOT | true | false | 4 | 20231213_230635__773 | 0 | 0.0 | 17.8071 | 0 | [82, 522] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__InJulia__1SHOT__20231213_230635__773.json | 25.0 | missing | missing | missing | |
| 432 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | InJulia | 1SHOT | true | false | 4 | 20231224_214941__941 | 0 | 0.0 | 36.2942 | 0 | [83, 268] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__InJulia__1SHOT__20231224_214941__941.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 433 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | InJulia | 1SHOT | true | true | 4 | 20231224_215118__499 | 0 | 0.0 | 96.6047 | 0 | [83, 719] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__InJulia__1SHOT__20231224_215118__499.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 434 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | InJulia | 1SHOT | true | false | 4 | 20231226_204956__298 | 0 | 0.0 | 41.0559 | 0 | [83, 302] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__InJulia__1SHOT__20231226_204956__298.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 435 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231213_230617__423 | 0 | 0.0 | 5.68054 | 0 | [112, 154] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231213_230617__423.json | 50.0 | missing | missing | missing | |
| 436 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_214840__705 | 0 | 0.0 | 9.99507 | 0 | [122, 58] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231224_214840__705.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 437 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231224_214904__521 | 1 | 0.0 | 24.0301 | 1 | [122, 167] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231224_214904__521.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 438 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 4 | 20231226_204914__195 | 1 | 0.0 | 9.22783 | 1 | [122, 52] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231226_204914__195.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 439 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231213_230612__174 | 0 | 0.0 | 7.55765 | 0 | [239, 171] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231213_230612__174.json | 25.0 | missing | missing | missing | |
| 440 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 4 | 20231224_214806__891 | 0 | 0.0 | 53.8569 | 0 | [248, 159] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231224_214806__891.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 441 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231224_214830__407 | 0 | 0.0 | 24.4047 | 0 | [248, 148] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231224_214830__407.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 442 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | false | false | 4 | 20231226_204905__349 | 0 | 0.0 | 62.1504 | 0 | [248, 242] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231226_204905__349.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 443 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231213_230726__170 | 0 | 0.0 | 19.3507 | 0 | [11, 519] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231213_230726__170.json | 50.0 | missing | missing | missing | |
| 444 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 4 | 20231224_215516__990 | 0 | 0.0 | 37.8541 | 0 | [396, 221] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231224_215516__990.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 445 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231224_215614__189 | 0 | 0.0 | 57.4387 | 0 | [396, 365] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231224_215614__189.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 446 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 4 | 20231226_205220__656 | 0 | 0.0 | 52.9814 | 0 | [396, 333] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231226_205220__656.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 447 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaRecapTask | 1SHOT | false | false | 4 | 20231213_230707__428 | 0 | 0.0 | 18.6963 | 0 | [383, 424] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231213_230707__428.json | 0.0 | missing | missing | missing | |
| 448 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 4 | 20231224_215356__677 | 0 | 0.0 | 53.3473 | 0 | [394, 336] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231224_215356__677.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 449 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 4 | 20231224_215438__237 | 1 | 0.0 | 41.6787 | 1 | [394, 250] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231224_215438__237.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 450 | Apple-MacBook-Pro-M1 | add_yearmonth | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 4 | 20231226_205127__551 | 0 | 0.0 | 90.4767 | 0 | [394, 600] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/add_yearmonth/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231226_205127__551.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 451 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | InJulia | 1SHOT | true | false | 5 | 20231213_232514__958 | 0 | 0.0 | 13.319 | 0 | [112, 384] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231213_232514__958.json | 25.0 | missing | missing | missing | |
| 452 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231224_225931__253 | 0 | 0.0 | 13.5563 | 0 | [120, 238] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231224_225931__253.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 453 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231224_225944__362 | 0 | 0.0 | 12.4507 | 0 | [120, 217] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231224_225944__362.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 454 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231226_212420__786 | 0 | 0.0 | 15.4032 | 0 | [120, 261] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231226_212420__786.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 455 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231213_232501__569 | 0 | 0.0 | 11.7851 | 0 | [141, 328] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231213_232501__569.json | 25.0 | missing | missing | missing | |
| 456 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_225911__412 | 5 | 0.0 | 13.3111 | 2 | [158, 227] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231224_225911__412.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 457 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_225917__408 | 0 | 0.0 | 5.56346 | 0 | [158, 81] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231224_225917__408.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 458 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_212404__496 | 0 | 0.0 | 13.511 | 0 | [158, 217] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231226_212404__496.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 459 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_232449__119 | 0 | 0.0 | 15.3734 | 0 | [311, 364] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231213_232449__119.json | 0.0 | missing | missing | missing | |
| 460 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231224_225851__500 | 5 | 0.0 | 24.2962 | 2 | [329, 212] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231224_225851__500.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 461 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_225858__277 | 0 | 0.0 | 6.74498 | 0 | [329, 73] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231224_225858__277.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 462 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_212350__450 | 0 | 0.0 | 21.2614 | 0 | [329, 161] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231226_212350__450.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 463 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_232613__120 | 0 | 0.0 | 20.392 | 0 | [11, 544] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231213_232613__120.json | 50.0 | missing | missing | missing | |
| 464 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_230133__762 | 0 | 0.0 | 36.4796 | 0 | [423, 579] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231224_230133__762.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 465 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231224_230157__871 | 0 | 0.0 | 22.7769 | 0 | [423, 345] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231224_230157__871.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 466 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_212512__303 | 0 | 0.0 | 16.2021 | 0 | [423, 220] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231226_212512__303.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 467 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_232553__613 | 0 | 0.0 | 20.2837 | 0 | [412, 450] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231213_232553__613.json | 50.0 | missing | missing | missing | |
| 468 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaRecapTask | 1SHOT | false | false | 5 | 20231224_230032__825 | 0 | 0.0 | 28.5827 | 0 | [420, 446] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231224_230032__825.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 469 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_230057__213 | 0 | 0.0 | 24.8535 | 0 | [420, 382] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231224_230057__213.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 470 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_212456__774 | 0 | 0.0 | 35.5977 | 0 | [420, 545] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231226_212456__774.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 471 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231213_232656__691 | 0 | 0.0 | 12.9591 | 0 | [112, 373] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__InJulia__1SHOT__20231213_232656__691.json | 0.0 | missing | missing | missing | |
| 472 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231224_230310__987 | 0 | 0.0 | 19.9744 | 0 | [94, 362] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__InJulia__1SHOT__20231224_230310__987.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 473 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231224_230323__346 | 0 | 0.0 | 13.7482 | 0 | [94, 246] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__InJulia__1SHOT__20231224_230323__346.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 474 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231213_232643__163 | 0 | 0.0 | 5.08613 | 0 | [141, 128] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231213_232643__163.json | 0.0 | missing | missing | missing | |
| 475 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_230243__752 | 0 | 0.0 | 29.3851 | 0 | [95, 533] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231224_230243__752.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 476 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_230250__297 | 0 | 0.0 | 6.651 | 0 | [95, 112] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231224_230250__297.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 477 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231213_232638__835 | 0 | 0.0 | 24.3532 | 0 | [311, 598] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231213_232638__835.json | 25.0 | missing | missing | missing | |
| 478 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_230212__242 | 0 | 0.0 | 15.0038 | 0 | [204, 58] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231224_230212__242.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 479 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_232753__165 | 0 | 0.0 | 20.4715 | 0 | [11, 546] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231213_232753__165.json | 25.0 | missing | missing | missing | |
| 480 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231224_230551__638 | 0 | 0.0 | 8.0979 | 0 | [112, 135] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231224_230551__638.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 481 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231224_230553__168 | 0 | 0.0 | 2.29263 | 0 | [112, 24] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231224_230553__168.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 482 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_232732__718 | 0 | 0.0 | 26.2838 | 0 | [412, 600] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231213_232732__718.json | 50.0 | missing | missing | missing | |
| 483 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231224_230535__473 | 0 | 0.0 | 24.9075 | 0 | [109, 447] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231224_230535__473.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 484 | Apple-MacBook-Pro-M1 | audi_filter | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231224_230543__850 | 0 | 0.0 | 8.02735 | 0 | [109, 134] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231224_230543__850.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 485 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_212201__368 | 0 | 0.0 | 20.4052 | 0 | [1, 604] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_212201__368.json | 0.0 | missing | missing | missing | |
| 486 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_212216__687 | 0 | 0.0 | 14.3885 | 0 | [1, 438] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_212216__687.json | 0.0 | missing | missing | missing | |
| 487 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231224_232702__977 | 5 | 0.0 | 40.8284 | 2 | [118, 232] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231224_232702__977.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 488 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231224_232806__261 | 0 | 0.0 | 63.7885 | 0 | [118, 379] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231224_232806__261.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 489 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231226_213533__400 | 0 | 0.0 | 45.2163 | 0 | [118, 266] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231226_213533__400.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 490 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_212117__970 | 0 | 0.0 | 8.90231 | 0 | [1, 275] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_212117__970.json | 25.0 | missing | missing | missing | |
| 491 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_212119__143 | 0 | 0.0 | 2.48083 | 0 | [1, 79] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_212119__143.json | 25.0 | missing | missing | missing | |
| 492 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_232544__252 | 0 | 0.0 | 52.4488 | 0 | [159, 302] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_232544__252.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 493 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_232620__190 | 2 | 0.0 | 35.3575 | 2 | [159, 196] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_232620__190.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 494 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_213448__251 | 0 | 0.0 | 54.4396 | 0 | [159, 317] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_213448__251.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 495 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_212039__932 | 0 | 0.0 | 20.6687 | 0 | [1, 574] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_212039__932.json | 25.0 | missing | missing | missing | |
| 496 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_212054__361 | 0 | 0.0 | 15.5367 | 0 | [1, 441] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_212054__361.json | 25.0 | missing | missing | missing | |
| 497 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231224_232411__590 | 5 | 0.0 | 78.5092 | 2 | [334, 259] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_232411__590.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 498 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231224_232452__732 | 0 | 0.0 | 39.4602 | 0 | [334, 187] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_232452__732.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 499 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_213353__826 | 2 | 0.0 | 80.9641 | 2 | [334, 297] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_213353__826.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 500 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_212458__605 | 0 | 0.0 | 16.2798 | 0 | [1, 448] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_212458__605.json | 25.0 | missing | missing | missing | |
| 501 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_212518__448 | 0 | 0.0 | 20.5976 | 0 | [1, 557] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_212518__448.json | 25.0 | missing | missing | missing | |
| 502 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_233230__523 | 0 | 0.0 | 45.6835 | 0 | [447, 211] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_233230__523.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 503 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_233257__276 | 0 | 0.0 | 26.0678 | 0 | [447, 92] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_233257__276.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 504 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_213758__264 | 0 | 0.0 | 55.1589 | 0 | [447, 269] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_213758__264.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 505 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_212353__424 | 0 | 0.0 | 36.2872 | 0 | [1, 929] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_212353__424.json | 25.0 | missing | missing | missing | |
| 506 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_212419__977 | 0 | 0.0 | 26.3818 | 0 | [1, 699] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_212419__977.json | 25.0 | missing | missing | missing | |
| 507 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_233040__885 | 0 | 0.0 | 66.7025 | 0 | [445, 335] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_233040__885.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 508 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231224_233145__958 | 0 | 0.0 | 63.1867 | 0 | [445, 312] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_233145__958.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 509 | Apple-MacBook-Pro-M1 | audi_filter | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_213702__163 | 0 | 0.0 | 89.187 | 0 | [445, 471] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_213702__163.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 510 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231226_215357__322 | 0 | 0.0 | 7.71196 | 0 | [118, 291] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231226_215357__322.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 511 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_101255__833 | 0 | 0.0 | 7.57254 | 0 | [118, 286] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_101255__833.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 512 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_101303__494 | 0 | 0.0 | 8.00399 | 0 | [118, 302] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_101303__494.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 513 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_101318__509 | 0 | 0.0 | 14.7564 | 0 | [118, 553] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_101318__509.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 514 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_215349__385 | 0 | 0.0 | 7.69335 | 0 | [155, 284] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_215349__385.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 515 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_101235__876 | 0 | 0.0 | 8.44283 | 0 | [155, 313] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_101235__876.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 516 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_101240__238 | 0 | 0.0 | 5.3209 | 0 | [155, 193] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_101240__238.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 517 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_101248__797 | 0 | 0.0 | 7.0956 | 0 | [155, 261] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_101248__797.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 518 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_215341__511 | 0 | 0.0 | 9.86017 | 0 | [324, 203] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_215341__511.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 519 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_101216__707 | 0 | 0.0 | 8.35944 | 0 | [324, 154] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_101216__707.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 520 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_101223__171 | 0 | 0.0 | 7.45848 | 0 | [324, 243] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_101223__171.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 521 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_101226__573 | 0 | 0.0 | 3.39898 | 0 | [324, 90] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_101226__573.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 522 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_215416__965 | 0 | 0.0 | 3.4585 | 0 | [407, 80] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_215416__965.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 523 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_101342__656 | 0 | 0.0 | 10.342 | 0 | [407, 332] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_101342__656.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 524 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_101350__349 | 0 | 0.0 | 8.5885 | 0 | [407, 269] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_101350__349.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 525 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_101401__991 | 0 | 0.0 | 10.189 | 0 | [407, 327] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_101401__991.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 526 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231226_215412__503 | 0 | 0.0 | 15.3586 | 0 | [404, 508] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_215412__503.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 527 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_101322__823 | 0 | 0.0 | 3.80351 | 0 | [404, 93] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_101322__823.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 528 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_101327__304 | 0 | 0.0 | 5.19129 | 0 | [404, 145] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_101327__304.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 529 | Apple-MacBook-Pro-M1 | audi_filter | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_101331__951 | 0 | 0.0 | 4.44021 | 0 | [404, 117] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_101331__951.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 530 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | InJulia | 1SHOT | true | false | 5 | 20231213_231800__199 | 0 | 0.0 | 14.3483 | 0 | [112, 413] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__InJulia__1SHOT__20231213_231800__199.json | 25.0 | missing | missing | missing | |
| 531 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | InJulia | 1SHOT | false | false | 5 | 20231224_224047__802 | 0 | 0.0 | 24.8359 | 0 | [112, 698] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__InJulia__1SHOT__20231224_224047__802.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 532 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | InJulia | 1SHOT | true | false | 5 | 20231224_224112__809 | 0 | 0.0 | 24.7241 | 0 | [1, 716] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__InJulia__1SHOT__20231224_224112__809.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 533 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | InJulia | 1SHOT | true | false | 5 | 20231226_211603__901 | 0 | 0.0 | 12.3288 | 0 | [112, 361] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__InJulia__1SHOT__20231226_211603__901.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 534 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231213_231745__966 | 0 | 0.0 | 14.0635 | 0 | [141, 393] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaExpertAsk__1SHOT__20231213_231745__966.json | 0.0 | missing | missing | missing | |
| 535 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_224015__818 | 0 | 0.0 | 11.0828 | 0 | [141, 308] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaExpertAsk__1SHOT__20231224_224015__818.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 536 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_224022__531 | 0 | 0.0 | 6.81989 | 0 | [1, 211] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaExpertAsk__1SHOT__20231224_224022__531.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 537 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_211551__635 | 0 | 0.0 | 17.1247 | 0 | [141, 486] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaExpertAsk__1SHOT__20231226_211551__635.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 538 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231213_231731__185 | 0 | 0.0 | 23.1192 | 0 | [311, 567] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231213_231731__185.json | 25.0 | missing | missing | missing | |
| 539 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_223947__859 | 0 | 0.0 | 28.0687 | 0 | [329, 545] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231224_223947__859.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 540 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_224004__559 | 0 | 0.0 | 17.1974 | 0 | [1, 483] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231224_224004__559.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 541 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_211534__533 | 0 | 0.0 | 15.8703 | 0 | [329, 240] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_211534__533.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 542 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_231858__331 | 0 | 0.0 | 18.5963 | 0 | [11, 500] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231213_231858__331.json | 25.0 | missing | missing | missing | |
| 543 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_224237__697 | 0 | 0.0 | 17.3283 | 0 | [11, 468] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231224_224237__697.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 544 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_224303__188 | 3 | 0.0 | 25.7189 | 2 | [1, 680] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231224_224303__188.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 545 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_211658__605 | 0 | 0.0 | 21.9796 | 0 | [11, 593] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_211658__605.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 546 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_231839__652 | 0 | 0.0 | 21.487 | 0 | [412, 480] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaRecapTask__1SHOT__20231213_231839__652.json | 0.0 | missing | missing | missing | |
| 547 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231224_224205__630 | 0 | 0.0 | 22.6624 | 0 | [412, 511] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaRecapTask__1SHOT__20231224_224205__630.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 548 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231224_224220__662 | 0 | 0.0 | 14.9063 | 0 | [1, 412] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaRecapTask__1SHOT__20231224_224220__662.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 549 | Apple-MacBook-Pro-M1 | audi_filter | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_211636__533 | 0 | 0.0 | 32.2092 | 0 | [412, 754] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/llama2/evaluation__JuliaRecapTask__1SHOT__20231226_211636__533.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 550 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | InJulia | 1SHOT | false | false | 5 | 20231213_232850__881 | 0 | 0.0 | 24.6754 | 0 | [112, 694] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__InJulia__1SHOT__20231213_232850__881.json | 0.0 | missing | missing | missing | |
| 551 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | InJulia | 1SHOT | true | true | 5 | 20231224_230642__886 | 0 | 0.0 | 9.19222 | 1 | [112, 293] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__InJulia__1SHOT__20231224_230642__886.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 552 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | InJulia | 1SHOT | true | true | 5 | 20231224_230651__560 | 0 | 0.0 | 8.54393 | 0 | [112, 270] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__InJulia__1SHOT__20231224_230651__560.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 553 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | InJulia | 1SHOT | true | true | 5 | 20231226_212545__133 | 0 | 0.0 | 6.49587 | 0 | [112, 203] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__InJulia__1SHOT__20231226_212545__133.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 554 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_232826__278 | 0 | 0.0 | 16.8748 | 0 | [141, 471] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231213_232826__278.json | 50.0 | missing | missing | missing | |
| 555 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231224_230627__531 | 0 | 0.0 | 8.29796 | 0 | [151, 258] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231224_230627__531.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 556 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_230633__374 | 0 | 0.0 | 5.77436 | 0 | [151, 173] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231224_230633__374.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 557 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_212538__236 | 0 | 0.0 | 9.45083 | 0 | [151, 296] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231226_212538__236.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 558 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_232809__845 | 0 | 0.0 | 15.818 | 0 | [311, 376] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231213_232809__845.json | 0.0 | missing | missing | missing | |
| 559 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231224_230609__993 | 0 | 0.0 | 15.8305 | 0 | [321, 275] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231224_230609__993.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 560 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_230619__293 | 0 | 0.0 | 9.58397 | 0 | [321, 266] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231224_230619__293.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 561 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_212528__901 | 0 | 0.0 | 15.7137 | 0 | [321, 278] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231226_212528__901.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 562 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_233007__616 | 0 | 0.0 | 26.9342 | 0 | [11, 704] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231213_233007__616.json | 25.0 | missing | missing | missing | |
| 563 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_230741__371 | 0 | 0.0 | 9.64082 | 0 | [415, 254] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231224_230741__371.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 564 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_230752__484 | 0 | 0.0 | 10.9637 | 0 | [415, 295] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231224_230752__484.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 565 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_212604__890 | 0 | 0.0 | 10.6755 | 0 | [415, 285] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231226_212604__890.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 566 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_232941__632 | 0 | 0.0 | 29.5601 | 0 | [412, 680] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaRecapTask__1SHOT__20231213_232941__632.json | 50.0 | missing | missing | missing | |
| 567 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaRecapTask | 1SHOT | true | false | 5 | 20231224_230719__699 | 0 | 0.0 | 11.8408 | 0 | [412, 321] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaRecapTask__1SHOT__20231224_230719__699.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 568 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_230730__136 | 0 | 0.0 | 11.6385 | 0 | [412, 318] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaRecapTask__1SHOT__20231224_230730__136.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 569 | Apple-MacBook-Pro-M1 | audi_filter | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_212553__387 | 0 | 0.0 | 8.28417 | 0 | [412, 212] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder/evaluation__JuliaRecapTask__1SHOT__20231226_212553__387.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 570 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_174702__416 | 0 | 0.0 | 11.2576 | 0 | [112, 211] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_174702__416.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 571 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_174714__635 | 0 | 0.0 | 11.0988 | 0 | [112, 208] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_174714__635.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 572 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_174727__264 | 0 | 0.0 | 12.809 | 0 | [112, 238] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_174727__264.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 573 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_174621__733 | 4 | 0.0 | 11.5945 | 2 | [151, 213] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_174621__733.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 574 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_174638__972 | 0 | 0.0 | 16.5895 | 0 | [151, 313] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_174638__972.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 575 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_174651__317 | 0 | 0.0 | 12.6341 | 1 | [151, 233] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_174651__317.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 576 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_174529__689 | 0 | 0.0 | 18.5315 | 0 | [321, 325] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_174529__689.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 577 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_174551__387 | 0 | 0.0 | 22.3068 | 1 | [321, 396] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_174551__387.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 578 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_174609__581 | 0 | 0.0 | 16.8155 | 0 | [321, 292] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_174609__581.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 579 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_174835__337 | 4 | 0.0 | 18.2052 | 2 | [415, 308] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_174835__337.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 580 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_174857__233 | 0 | 0.0 | 21.4252 | 0 | [415, 353] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_174857__233.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 581 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_174912__613 | 0 | 0.0 | 14.6523 | 0 | [415, 241] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_174912__613.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 582 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_174748__762 | 0 | 0.0 | 20.5169 | 0 | [412, 352] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_174748__762.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 583 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_174758__672 | 0 | 0.0 | 8.77152 | 0 | [412, 127] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_174758__672.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 584 | Apple-MacBook-Pro-M1 | audi_filter | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_174817__479 | 0 | 0.0 | 19.4392 | 0 | [412, 333] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_174817__479.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 585 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_213311__618 | 0 | 0.0 | 16.5616 | 0 | [1, 499] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_213311__618.json | 0.0 | missing | missing | missing | |
| 586 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_213328__160 | 0 | 0.0 | 16.282 | 0 | [1, 491] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_213328__160.json | 25.0 | missing | missing | missing | |
| 587 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231224_233708__463 | 0 | 0.0 | 3.67477 | 0 | [111, 79] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231224_233708__463.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 588 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231224_233720__806 | 0 | 0.0 | 11.4593 | 0 | [111, 281] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231224_233720__806.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 589 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_214023__652 | 0 | 0.0 | 8.5989 | 0 | [111, 207] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231226_214023__652.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 590 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_213227__461 | 0 | 0.0 | 17.2195 | 0 | [1, 511] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_213227__461.json | 25.0 | missing | missing | missing | |
| 591 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_213235__831 | 0 | 0.0 | 8.55477 | 0 | [1, 265] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_213235__831.json | 25.0 | missing | missing | missing | |
| 592 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_233700__276 | 0 | 0.0 | 3.54085 | 0 | [152, 71] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_233700__276.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 593 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_233704__668 | 0 | 0.0 | 4.04586 | 0 | [152, 84] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_233704__668.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 594 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_214015__467 | 0 | 0.0 | 6.7531 | 0 | [152, 154] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_214015__467.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 595 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_213131__968 | 0 | 0.0 | 28.7351 | 0 | [1, 774] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_213131__968.json | 0.0 | missing | missing | missing | |
| 596 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_213154__282 | 0 | 0.0 | 22.4363 | 0 | [1, 619] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_213154__282.json | 0.0 | missing | missing | missing | |
| 597 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231224_233641__328 | 0 | 0.0 | 16.933 | 0 | [332, 244] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_233641__328.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 598 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_233657__267 | 0 | 0.0 | 16.1602 | 0 | [332, 362] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_233657__267.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 599 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_214008__365 | 0 | 0.0 | 30.5404 | 0 | [332, 582] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_214008__365.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 600 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_213610__868 | 0 | 0.0 | 19.1421 | 0 | [1, 521] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_213610__868.json | 25.0 | missing | missing | missing | |
| 601 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_213630__338 | 0 | 0.0 | 20.1993 | 0 | [1, 547] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_213630__338.json | 25.0 | missing | missing | missing | |
| 602 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_233826__783 | 0 | 0.0 | 15.4193 | 0 | [419, 329] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_233826__783.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 603 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231224_233842__547 | 0 | 0.0 | 15.9489 | 0 | [419, 342] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_233842__547.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 604 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231226_214126__873 | 0 | 0.0 | 28.7342 | 0 | [419, 649] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_214126__873.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 605 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_213508__315 | 0 | 0.0 | 28.6565 | 0 | [1, 753] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_213508__315.json | 25.0 | missing | missing | missing | |
| 606 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_213532__982 | 0 | 0.0 | 23.2481 | 0 | [1, 623] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_213532__982.json | 25.0 | missing | missing | missing | |
| 607 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231224_233752__628 | 0 | 0.0 | 8.05931 | 0 | [417, 146] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_233752__628.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 608 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231224_233811__441 | 0 | 0.0 | 18.4726 | 0 | [417, 404] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_233811__441.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 609 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_214057__899 | 0 | 0.0 | 33.4894 | 0 | [417, 760] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_214057__899.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 610 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | false | 5 | 20231227_223421__490 | 0 | 0.0 | 13.7961 | 0 | [110, 431] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_223421__490.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 611 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_223432__207 | 0 | 0.0 | 10.8264 | 0 | [110, 336] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_223432__207.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 612 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_223443__532 | 0 | 0.0 | 10.209 | 0 | [110, 316] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_223443__532.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 613 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_223453__642 | 0 | 0.0 | 10.5213 | 0 | [110, 326] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_223453__642.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 614 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_223502__365 | 0 | 0.0 | 7.57016 | 0 | [110, 230] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_223502__365.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 615 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_223352__241 | 0 | 0.0 | 4.00805 | 0 | [151, 108] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_223352__241.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 616 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_223355__144 | 0 | 0.0 | 3.25964 | 0 | [151, 84] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_223355__144.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 617 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_223359__315 | 0 | 0.0 | 3.53285 | 0 | [151, 93] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_223359__315.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 618 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_223403__366 | 0 | 0.0 | 3.93564 | 0 | [151, 106] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_223403__366.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 619 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_223407__371 | 0 | 0.0 | 4.05486 | 0 | [151, 110] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_223407__371.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 620 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_223252__281 | 0 | 0.0 | 12.9609 | 0 | [331, 286] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_223252__281.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 621 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_223307__939 | 0 | 0.0 | 14.5606 | 0 | [331, 410] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_223307__939.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 622 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_223320__446 | 0 | 0.0 | 13.0876 | 0 | [331, 365] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_223320__446.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 623 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_223337__363 | 5 | 0.0 | 17.4109 | 2 | [331, 497] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_223337__363.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 624 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_223347__839 | 0 | 0.0 | 9.32001 | 0 | [331, 247] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_223347__839.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 625 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_223644__253 | 0 | 0.0 | 11.4823 | 0 | [418, 298] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_223644__253.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 626 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_223657__511 | 0 | 0.0 | 13.4346 | 0 | [418, 358] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_223657__511.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 627 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_223711__312 | 0 | 0.0 | 13.5921 | 0 | [418, 363] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_223711__312.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 628 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_223727__734 | 0 | 0.0 | 15.8653 | 0 | [418, 432] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_223727__734.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 629 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_223740__743 | 0 | 0.0 | 12.3264 | 0 | [418, 324] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_223740__743.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 630 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_223528__818 | 0 | 0.0 | 26.524 | 0 | [416, 749] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_223528__818.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 631 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_223541__855 | 0 | 0.0 | 12.9143 | 0 | [416, 347] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_223541__855.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 632 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_223606__575 | 0 | 0.0 | 24.1432 | 0 | [416, 679] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_223606__575.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 633 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_223620__729 | 0 | 0.0 | 14.061 | 0 | [416, 381] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_223620__729.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 634 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_223632__421 | 5 | 0.0 | 11.3679 | 2 | [416, 299] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_223632__421.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 635 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_224037__955 | 0 | 0.0 | 15.1425 | 0 | [110, 373] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_224037__955.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 636 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231227_224053__403 | 0 | 0.0 | 14.5193 | 0 | [110, 357] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_224053__403.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 637 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_224104__372 | 0 | 0.0 | 11.4335 | 0 | [110, 279] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_224104__372.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 638 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231227_224118__722 | 0 | 0.0 | 14.4321 | 0 | [110, 355] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_224118__722.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 639 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_224134__742 | 0 | 0.0 | 15.7617 | 0 | [110, 389] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_224134__742.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 640 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_223948__417 | 0 | 0.0 | 9.81254 | 0 | [151, 232] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_223948__417.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 641 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_223954__786 | 0 | 0.0 | 5.5404 | 0 | [151, 122] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_223954__786.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 642 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_224011__465 | 0 | 0.0 | 16.6735 | 0 | [151, 406] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_224011__465.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 643 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_224017__659 | 0 | 0.0 | 5.26489 | 0 | [151, 115] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_224017__659.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 644 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_224022__596 | 0 | 0.0 | 5.02228 | 0 | [151, 109] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_224022__596.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 645 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_223803__117 | 0 | 0.0 | 22.3678 | 0 | [331, 492] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_223803__117.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 646 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_223830__117 | 0 | 0.0 | 26.934 | 0 | [331, 622] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_223830__117.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 647 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_223848__998 | 0 | 0.0 | 18.4523 | 0 | [331, 417] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_223848__998.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 648 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_223909__726 | 0 | 0.0 | 20.5014 | 0 | [331, 467] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_223909__726.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 649 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_223938__552 | 0 | 0.0 | 28.7054 | 0 | [331, 664] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_223938__552.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 650 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_224434__413 | 0 | 0.0 | 23.0681 | 0 | [418, 512] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_224434__413.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 651 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_224459__908 | 0 | 0.0 | 24.5778 | 0 | [418, 548] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_224459__908.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 652 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_224521__975 | 0 | 0.0 | 22.2318 | 0 | [418, 492] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_224521__975.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 653 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_224550__878 | 0 | 0.0 | 28.7574 | 0 | [418, 647] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_224550__878.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 654 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_224605__506 | 0 | 0.0 | 15.1288 | 0 | [418, 320] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_224605__506.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 655 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_224211__746 | 0 | 0.0 | 37.1038 | 0 | [416, 844] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_224211__746.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 656 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_224233__756 | 0 | 0.0 | 21.1316 | 0 | [416, 469] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_224233__756.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 657 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_224302__425 | 0 | 0.0 | 28.5696 | 0 | [416, 646] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_224302__425.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 658 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_224333__291 | 0 | 0.0 | 30.8144 | 0 | [416, 699] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_224333__291.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 659 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_224411__415 | 0 | 0.0 | 38.0831 | 0 | [416, 867] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_224411__415.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 660 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231226_115844__738 | 0 | 0.0 | 15.756 | 0 | [110, 279] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_115844__738.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 661 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | false | 5 | 20231226_115901__349 | 0 | 0.0 | 17.576 | 0 | [110, 307] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_115901__349.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 662 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_215211__430 | 0 | 0.0 | 34.1386 | 0 | [110, 622] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_215211__430.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 663 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_115822__128 | 0 | 0.0 | 5.62752 | 0 | [151, 88] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_115822__128.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 664 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_115828__798 | 0 | 0.0 | 6.03432 | 0 | [151, 95] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_115828__798.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 665 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_215137__391 | 0 | 0.0 | 5.56676 | 0 | [151, 89] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_215137__391.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 666 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_115746__605 | 0 | 0.0 | 50.2133 | 0 | [331, 855] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_115746__605.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 667 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_115816__822 | 0 | 0.0 | 29.3349 | 0 | [331, 496] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_115816__822.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 668 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_215132__980 | 0 | 0.0 | 38.8799 | 0 | [331, 514] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_215132__980.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 669 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_120102__131 | 0 | 0.0 | 18.2393 | 0 | [418, 288] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_120102__131.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 670 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_120131__262 | 0 | 0.0 | 28.9466 | 0 | [418, 480] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_120131__262.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 671 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_215332__847 | 0 | 0.0 | 42.0864 | 0 | [418, 715] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_215332__847.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 672 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231226_120006__941 | 0 | 0.0 | 33.8602 | 0 | [416, 558] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_120006__941.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 673 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_120044__870 | 0 | 0.0 | 37.3807 | 0 | [416, 631] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_120044__870.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 674 | Apple-MacBook-Pro-M1 | audi_filter | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231226_215249__376 | 0 | 0.0 | 37.2972 | 0 | [416, 633] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_215249__376.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 675 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_102046__388 | 0 | 0.0 | 39.7457 | 1 | [116, 226] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_102046__388.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 676 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_102132__798 | 0 | 0.0 | 45.4951 | 0 | [116, 261] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_102132__798.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 677 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_102204__523 | 5 | 0.0 | 31.4011 | 2 | [116, 175] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_102204__523.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 678 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_142103__624 | 0 | 0.0 | 56.3527 | 0 | [116, 325] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_142103__624.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 679 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_142156__424 | 0 | 0.0 | 52.529 | 0 | [116, 302] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_142156__424.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 680 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_101845__468 | 0 | 0.0 | 14.1809 | 0 | [155, 64] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_101845__468.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 681 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_101937__788 | 0 | 0.0 | 52.5525 | 0 | [155, 298] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_101937__788.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 682 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_102006__235 | 0 | 0.0 | 28.3406 | 0 | [155, 151] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_102006__235.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 683 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_141947__162 | 0 | 0.0 | 19.3516 | 0 | [155, 95] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_141947__162.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 684 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_142007__698 | 0 | 0.0 | 20.4496 | 0 | [155, 102] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_142007__698.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 685 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_101623__802 | 0 | 0.0 | 142.62 | 0 | [343, 760] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_101623__802.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 686 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_101708__241 | 0 | 0.0 | 43.5989 | 0 | [343, 211] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_101708__241.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 687 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_101830__125 | 0 | 0.0 | 82.4544 | 0 | [343, 440] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_101830__125.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 688 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_141740__494 | 0 | 0.0 | 136.569 | 0 | [343, 746] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_141740__494.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 689 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_141927__896 | 0 | 0.0 | 106.221 | 0 | [343, 574] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_141927__896.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 690 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_102758__758 | 5 | 0.0 | 62.2628 | 2 | [429, 298] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_102758__758.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 691 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_102849__192 | 0 | 0.0 | 50.6756 | 0 | [429, 232] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_102849__192.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 692 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_102917__791 | 0 | 0.0 | 27.7508 | 0 | [429, 97] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_102917__791.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 693 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_142501__399 | 4 | 0.0 | 60.2849 | 2 | [429, 289] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_142501__399.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 694 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_142631__296 | 0 | 0.0 | 89.4277 | 0 | [429, 458] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_142631__296.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 695 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_102346__872 | 0 | 0.0 | 101.644 | 0 | [427, 531] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_102346__872.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 696 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_102500__546 | 0 | 0.0 | 73.5209 | 0 | [427, 369] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_102500__546.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 697 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_102654__250 | 0 | 0.0 | 114.123 | 0 | [427, 576] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_102654__250.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 698 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_142248__206 | 0 | 0.0 | 51.1402 | 0 | [427, 236] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_142248__206.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 699 | Apple-MacBook-Pro-M1 | audi_filter | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_142400__452 | 5 | 0.0 | 71.5277 | 2 | [427, 355] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_142400__452.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 700 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_213840__726 | 0 | 0.0 | 22.8524 | 0 | [1, 669] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_213840__726.json | 25.0 | missing | missing | missing | |
| 701 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_213858__640 | 0 | 0.0 | 18.3056 | 0 | [1, 547] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_213858__640.json | 25.0 | missing | missing | missing | |
| 702 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231224_233946__782 | 0 | 0.0 | 10.7987 | 0 | [119, 263] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231224_233946__782.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 703 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231224_233957__126 | 0 | 0.0 | 10.4082 | 0 | [119, 253] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231224_233957__126.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 704 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_214205__557 | 0 | 0.0 | 9.74859 | 0 | [119, 236] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231226_214205__557.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 705 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_213759__420 | 0 | 0.0 | 13.4379 | 0 | [1, 406] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_213759__420.json | 25.0 | missing | missing | missing | |
| 706 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_213803__115 | 0 | 0.0 | 4.49214 | 0 | [1, 142] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_213803__115.json | 0.0 | missing | missing | missing | |
| 707 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_233928__721 | 0 | 0.0 | 4.70385 | 0 | [160, 101] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_233928__721.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 708 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_233935__166 | 0 | 0.0 | 6.90787 | 0 | [160, 158] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_233935__166.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 709 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_214156__580 | 0 | 0.0 | 4.73735 | 0 | [160, 102] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_214156__580.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 710 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_213723__588 | 0 | 0.0 | 34.3122 | 0 | [1, 906] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_213723__588.json | 25.0 | missing | missing | missing | |
| 711 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_213737__829 | 0 | 0.0 | 13.7169 | 0 | [1, 392] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_213737__829.json | 0.0 | missing | missing | missing | |
| 712 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231224_233904__193 | 0 | 0.0 | 22.0211 | 0 | [340, 349] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_233904__193.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 713 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231224_233923__539 | 4 | 0.0 | 18.3041 | 2 | [340, 414] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_233923__539.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 714 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_214151__747 | 0 | 0.0 | 24.9021 | 0 | [340, 430] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_214151__747.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 715 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_214140__839 | 0 | 0.0 | 24.5033 | 0 | [1, 653] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_214140__839.json | 25.0 | missing | missing | missing | |
| 716 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_214147__964 | 0 | 0.0 | 7.21136 | 0 | [1, 206] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_214147__964.json | 0.0 | missing | missing | missing | |
| 717 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231224_234106__311 | 0 | 0.0 | 14.1238 | 0 | [427, 296] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_234106__311.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 718 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_234123__290 | 0 | 0.0 | 16.4729 | 0 | [427, 354] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_234123__290.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 719 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231226_214244__622 | 0 | 0.0 | 19.976 | 0 | [427, 438] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_214244__622.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 720 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_214023__424 | 0 | 0.0 | 19.5782 | 0 | [1, 532] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_214023__424.json | 25.0 | missing | missing | missing | |
| 721 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_214052__228 | 0 | 0.0 | 29.2956 | 0 | [1, 768] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_214052__228.json | 0.0 | missing | missing | missing | |
| 722 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_234037__734 | 0 | 0.0 | 17.5782 | 0 | [425, 381] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_234037__734.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 723 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_234052__285 | 0 | 0.0 | 14.3603 | 0 | [425, 302] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_234052__285.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 724 | Apple-MacBook-Pro-M1 | audi_filter | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_214224__913 | 0 | 0.0 | 18.1037 | 0 | [425, 393] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_214224__913.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 725 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | InJulia | 1SHOT | false | false | 5 | 20231213_231953__596 | 0 | 0.0 | 19.4905 | 0 | [112, 556] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231213_231953__596.json | 0.0 | missing | missing | missing | |
| 726 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231224_224416__986 | 0 | 0.0 | 7.62096 | 0 | [117, 235] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231224_224416__986.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 727 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231224_224422__889 | 0 | 0.0 | 5.86776 | 0 | [117, 177] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231224_224422__889.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 728 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231226_211733__576 | 0 | 0.0 | 9.62439 | 0 | [117, 302] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231226_211733__576.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 729 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231213_231934__963 | 0 | 0.0 | 15.021 | 0 | [141, 420] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231213_231934__963.json | 0.0 | missing | missing | missing | |
| 730 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_224358__556 | 0 | 0.0 | 10.6317 | 0 | [158, 327] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231224_224358__556.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 731 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_224408__499 | 0 | 0.0 | 9.63523 | 1 | [158, 296] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231224_224408__499.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 732 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_211723__898 | 0 | 0.0 | 6.36802 | 1 | [158, 189] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231226_211723__898.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 733 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231213_231919__844 | 0 | 0.0 | 21.1151 | 0 | [311, 516] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231213_231919__844.json | 25.0 | missing | missing | missing | |
| 734 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231224_224329__341 | 0 | 0.0 | 25.7076 | 0 | [338, 593] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231224_224329__341.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 735 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231224_224346__779 | 0 | 0.0 | 17.3167 | 0 | [338, 498] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231224_224346__779.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 736 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_211716__848 | 0 | 0.0 | 18.334 | 0 | [338, 374] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231226_211716__848.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 737 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_232046__172 | 0 | 0.0 | 17.6245 | 0 | [11, 475] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231213_232046__172.json | 0.0 | missing | missing | missing | |
| 738 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_224534__730 | 0 | 0.0 | 21.3382 | 0 | [425, 599] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231224_224534__730.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 739 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_224547__734 | 0 | 0.0 | 12.872 | 0 | [425, 343] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231224_224547__734.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 740 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_211747__237 | 0 | 0.0 | 2.44985 | 0 | [425, 15] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231226_211747__237.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 741 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | false | 5 | 20231213_232028__384 | 0 | 0.0 | 22.3044 | 0 | [412, 501] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231213_232028__384.json | 25.0 | missing | missing | missing | |
| 742 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_224456__348 | 0 | 0.0 | 17.4095 | 0 | [423, 484] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231224_224456__348.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 743 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | false | false | 5 | 20231224_224512__386 | 0 | 0.0 | 16.4644 | 0 | [423, 456] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231224_224512__386.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 744 | Apple-MacBook-Pro-M1 | audi_filter | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_211744__202 | 4 | 0.0 | 10.4729 | 2 | [423, 272] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231226_211744__202.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 745 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | InJulia | 1SHOT | true | false | 5 | 20231213_233249__106 | 0 | 0.0 | 13.9883 | 0 | [112, 402] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__InJulia__1SHOT__20231213_233249__106.json | 25.0 | missing | missing | missing | |
| 746 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231224_231038__295 | 0 | 0.0 | 21.4286 | 0 | [115, 382] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__InJulia__1SHOT__20231224_231038__295.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 747 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231224_231054__612 | 0 | 0.0 | 15.9452 | 0 | [115, 281] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__InJulia__1SHOT__20231224_231054__612.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 748 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231226_212718__778 | 0 | 0.0 | 2.96316 | 0 | [115, 37] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__InJulia__1SHOT__20231226_212718__778.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 749 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231213_233235__673 | 0 | 0.0 | 8.37548 | 0 | [141, 227] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231213_233235__673.json | 25.0 | missing | missing | missing | |
| 750 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_231008__340 | 0 | 0.0 | 7.60282 | 0 | [154, 120] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231224_231008__340.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 751 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_231017__392 | 0 | 0.0 | 8.54455 | 0 | [154, 137] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231224_231017__392.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 752 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_212715__465 | 0 | 0.0 | 8.50667 | 0 | [154, 136] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231226_212715__465.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 753 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231213_233227__412 | 0 | 0.0 | 26.1747 | 0 | [311, 644] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231213_233227__412.json | 25.0 | missing | missing | missing | |
| 754 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_230944__455 | 0 | 0.0 | 32.0509 | 0 | [324, 354] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231224_230944__455.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 755 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_231001__159 | 0 | 0.0 | 17.0012 | 0 | [324, 260] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231224_231001__159.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 756 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_212706__286 | 0 | 0.0 | 22.6313 | 0 | [324, 199] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231226_212706__286.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 757 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_233354__787 | 0 | 0.0 | 15.8157 | 0 | [11, 429] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231213_233354__787.json | 50.0 | missing | missing | missing | |
| 758 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231224_231257__882 | 0 | 0.0 | 28.0475 | 0 | [418, 435] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231224_231257__882.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 759 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231224_231324__848 | 0 | 0.0 | 27.0871 | 0 | [418, 418] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231224_231324__848.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 760 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_212834__337 | 0 | 0.0 | 42.8094 | 0 | [418, 685] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231226_212834__337.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 761 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_233338__868 | 0 | 0.0 | 24.794 | 0 | [412, 564] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231213_233338__868.json | 0.0 | missing | missing | missing | |
| 762 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_231205__714 | 0 | 0.0 | 37.4836 | 0 | [415, 598] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231224_231205__714.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 763 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_231229__971 | 0 | 0.0 | 23.2519 | 0 | [415, 356] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231224_231229__971.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 764 | Apple-MacBook-Pro-M1 | audi_filter | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_212751__683 | 0 | 0.0 | 33.8862 | 0 | [415, 540] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231226_212751__683.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 765 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | true | false | 5 | 20231219_214326__954 | 0 | 0.0 | 15.061 | 0 | [112, 435] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_214326__954.json | 25.0 | missing | missing | missing | |
| 766 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231219_214344__971 | 0 | 0.0 | 17.7868 | 0 | [1, 533] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_214344__971.json | 0.0 | missing | missing | missing | |
| 767 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | true | false | 5 | 20231219_214407__151 | 0 | 0.0 | 23.1137 | 0 | [1, 676] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_214407__151.json | 25.0 | missing | missing | missing | |
| 768 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231224_234221__551 | 0 | 0.0 | 24.0657 | 0 | [112, 883] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231224_234221__551.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 769 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231224_234233__458 | 0 | 0.0 | 11.6301 | 0 | [112, 441] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231224_234233__458.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 770 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_214306__922 | 0 | 0.0 | 15.6848 | 0 | [1, 469] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_214306__922.json | 0.0 | missing | missing | missing | |
| 771 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_214311__611 | 0 | 0.0 | 4.92129 | 0 | [1, 155] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_214311__611.json | 0.0 | missing | missing | missing | |
| 772 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_234152__302 | 0 | 0.0 | 15.8027 | 0 | [149, 586] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231224_234152__302.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 773 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_234157__339 | 0 | 0.0 | 5.01382 | 0 | [149, 182] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231224_234157__339.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 774 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_214323__915 | 0 | 0.0 | 29.0251 | 0 | [149, 1035] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_214323__915.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 775 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_214224__855 | 0 | 0.0 | 14.7061 | 0 | [1, 419] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_214224__855.json | 0.0 | missing | missing | missing | |
| 776 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_214240__884 | 0 | 0.0 | 15.4339 | 0 | [1, 438] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_214240__884.json | 0.0 | missing | missing | missing | |
| 777 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_234132__894 | 0 | 0.0 | 8.79392 | 0 | [318, 157] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231224_234132__894.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 778 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_234137__267 | 0 | 0.0 | 4.35688 | 0 | [318, 130] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231224_234137__267.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 779 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_214254__220 | 0 | 0.0 | 9.13263 | 0 | [318, 179] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_214254__220.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 780 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_214633__208 | 0 | 0.0 | 21.0881 | 0 | [11, 564] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_214633__208.json | 25.0 | missing | missing | missing | |
| 781 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_214651__480 | 0 | 0.0 | 17.886 | 0 | [1, 489] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_214651__480.json | 25.0 | missing | missing | missing | |
| 782 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_214708__999 | 0 | 0.0 | 17.2944 | 0 | [1, 474] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_214708__999.json | 25.0 | missing | missing | missing | |
| 783 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_234340__185 | 0 | 0.0 | 10.5159 | 0 | [401, 340] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231224_234340__185.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 784 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231224_234349__720 | 0 | 0.0 | 8.87795 | 0 | [401, 281] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231224_234349__720.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 785 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_214530__938 | 0 | 0.0 | 23.4796 | 0 | [412, 533] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_214530__938.json | 25.0 | missing | missing | missing | |
| 786 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_214554__571 | 0 | 0.0 | 23.8673 | 0 | [1, 638] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_214554__571.json | 25.0 | missing | missing | missing | |
| 787 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_214612__400 | 0 | 0.0 | 17.754 | 0 | [1, 486] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_214612__400.json | 25.0 | missing | missing | missing | |
| 788 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231224_234326__790 | 0 | 0.0 | 26.2133 | 0 | [398, 872] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231224_234326__790.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 789 | Apple-MacBook-Pro-M1 | audi_filter | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_234329__784 | 0 | 0.0 | 3.59231 | 0 | [398, 86] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231224_234329__784.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 790 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | InJulia | 1SHOT | true | false | 5 | 20231213_233436__898 | 0 | 0.0 | 15.124 | 0 | [112, 434] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231213_233436__898.json | 25.0 | missing | missing | missing | |
| 791 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231224_231743__437 | 4 | 0.0 | 54.3185 | 2 | [123, 416] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231224_231743__437.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 792 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231224_231818__635 | 5 | 0.0 | 33.6594 | 2 | [123, 252] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231224_231818__635.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 793 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | InJulia | 1SHOT | false | false | 5 | 20231226_213108__692 | 0 | 0.0 | 33.0507 | 0 | [123, 246] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231226_213108__692.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 794 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_233421__636 | 0 | 0.0 | 4.65878 | 0 | [141, 115] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231213_233421__636.json | 50.0 | missing | missing | missing | |
| 795 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_231559__406 | 5 | 0.0 | 23.5117 | 2 | [162, 160] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231224_231559__406.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 796 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_231649__993 | 0 | 0.0 | 49.315 | 0 | [162, 365] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231224_231649__993.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 797 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_213034__973 | 0 | 0.0 | 45.1372 | 0 | [162, 331] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231226_213034__973.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 798 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_233416__661 | 0 | 0.0 | 21.0427 | 0 | [311, 513] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231213_233416__661.json | 0.0 | missing | missing | missing | |
| 799 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231224_231454__311 | 0 | 0.0 | 90.6269 | 0 | [332, 477] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231224_231454__311.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 800 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231224_231535__880 | 0 | 0.0 | 40.2175 | 0 | [332, 263] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231224_231535__880.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 801 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_212949__376 | 5 | 0.0 | 74.5487 | 2 | [332, 366] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_212949__376.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 802 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_233540__548 | 0 | 0.0 | 20.5867 | 0 | [11, 549] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231213_233540__548.json | 25.0 | missing | missing | missing | |
| 803 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_232147__405 | 5 | 0.0 | 53.5722 | 2 | [426, 346] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231224_232147__405.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 804 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231224_232252__494 | 0 | 0.0 | 64.9799 | 0 | [426, 433] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231224_232252__494.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 805 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_213231__403 | 5 | 0.0 | 45.0821 | 2 | [426, 281] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_213231__403.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 806 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_233520__723 | 0 | 0.0 | 25.2515 | 0 | [412, 575] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231213_233520__723.json | 0.0 | missing | missing | missing | |
| 807 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_232005__484 | 0 | 0.0 | 25.9733 | 0 | [423, 134] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231224_232005__484.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 808 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231224_232053__991 | 0 | 0.0 | 47.9716 | 0 | [423, 305] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231224_232053__991.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 809 | Apple-MacBook-Pro-M1 | audi_filter | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_213146__866 | 5 | 0.0 | 37.6451 | 2 | [423, 225] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231226_213146__866.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 810 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_212733__752 | 0 | 0.0 | 15.7299 | 0 | [1, 476] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_212733__752.json | 25.0 | missing | missing | missing | |
| 811 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_212748__114 | 0 | 0.0 | 15.0261 | 0 | [1, 456] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_212748__114.json | 25.0 | missing | missing | missing | |
| 812 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231224_233432__346 | 0 | 0.0 | 18.7286 | 0 | [119, 310] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231224_233432__346.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 813 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231224_233446__959 | 0 | 0.0 | 13.5703 | 0 | [119, 221] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231224_233446__959.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 814 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231226_213853__919 | 0 | 0.0 | 13.2754 | 0 | [119, 216] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231226_213853__919.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 815 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_212648__356 | 0 | 0.0 | 11.022 | 0 | [1, 337] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_212648__356.json | 0.0 | missing | missing | missing | |
| 816 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_212656__375 | 0 | 0.0 | 8.73431 | 0 | [1, 270] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_212656__375.json | 25.0 | missing | missing | missing | |
| 817 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_233358__844 | 0 | 0.0 | 13.7336 | 0 | [160, 219] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_233358__844.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 818 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_233413__359 | 0 | 0.0 | 15.1158 | 0 | [160, 243] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231224_233413__359.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 819 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_213839__487 | 0 | 0.0 | 9.82033 | 0 | [160, 151] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_213839__487.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 820 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_212551__205 | 0 | 0.0 | 9.93304 | 0 | [1, 289] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_212551__205.json | 0.0 | missing | missing | missing | |
| 821 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_212617__795 | 0 | 0.0 | 25.8143 | 0 | [1, 703] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_212617__795.json | 25.0 | missing | missing | missing | |
| 822 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_233326__936 | 0 | 0.0 | 28.0832 | 0 | [340, 262] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_233326__936.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 823 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_233344__659 | 0 | 0.0 | 18.8453 | 0 | [340, 276] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231224_233344__659.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 824 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_213829__156 | 0 | 0.0 | 30.6923 | 0 | [340, 327] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_213829__156.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 825 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_213040__609 | 0 | 0.0 | 28.087 | 0 | [1, 739] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_213040__609.json | 25.0 | missing | missing | missing | |
| 826 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_213041__202 | 0 | 0.0 | 1.4005 | 0 | [1, 41] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_213041__202.json | 0.0 | missing | missing | missing | |
| 827 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231224_233607__846 | 0 | 0.0 | 17.0352 | 0 | [427, 231] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_233607__846.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 828 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231224_233624__405 | 0 | 0.0 | 16.2612 | 0 | [427, 218] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231224_233624__405.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 829 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231226_213937__216 | 0 | 0.0 | 24.5057 | 0 | [427, 355] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_213937__216.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 830 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_212929__969 | 0 | 0.0 | 24.7569 | 0 | [1, 660] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_212929__969.json | 25.0 | missing | missing | missing | |
| 831 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_212953__314 | 0 | 0.0 | 23.4219 | 0 | [1, 627] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_212953__314.json | 25.0 | missing | missing | missing | |
| 832 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_233534__311 | 0 | 0.0 | 21.0067 | 0 | [425, 297] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_233534__311.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 833 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_233550__155 | 0 | 0.0 | 16.62 | 0 | [425, 224] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231224_233550__155.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 834 | Apple-MacBook-Pro-M1 | audi_filter | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_213913__900 | 0 | 0.0 | 20.1565 | 0 | [425, 283] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_213913__900.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 835 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | InJulia | 1SHOT | true | false | 5 | 20231213_233053__748 | 0 | 0.0 | 13.593 | 0 | [112, 391] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__InJulia__1SHOT__20231213_233053__748.json | 25.0 | missing | missing | missing | |
| 836 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231224_230823__904 | 0 | 0.0 | 5.81542 | 0 | [118, 321] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__InJulia__1SHOT__20231224_230823__904.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 837 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | InJulia | 1SHOT | true | true | 5 | 20231224_230826__952 | 0 | 0.0 | 3.78288 | 0 | [118, 207] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__InJulia__1SHOT__20231224_230826__952.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 838 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | InJulia | 1SHOT | true | true | 5 | 20231226_212629__169 | 0 | 0.0 | 3.3126 | 0 | [118, 180] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__InJulia__1SHOT__20231226_212629__169.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 839 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231213_233040__791 | 0 | 0.0 | 10.8457 | 0 | [141, 300] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231213_233040__791.json | 25.0 | missing | missing | missing | |
| 840 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_230812__434 | 0 | 0.0 | 2.77818 | 0 | [155, 142] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231224_230812__434.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 841 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_230817__108 | 0 | 0.0 | 4.43217 | 0 | [155, 236] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231224_230817__108.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 842 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_212626__925 | 0 | 0.0 | 2.25792 | 0 | [155, 112] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231226_212626__925.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 843 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_233029__960 | 0 | 0.0 | 21.2506 | 0 | [311, 519] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231213_233029__960.json | 0.0 | missing | missing | missing | |
| 844 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_230802__781 | 0 | 0.0 | 8.81938 | 0 | [321, 270] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231224_230802__781.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 845 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_230810__717 | 0 | 0.0 | 8.06067 | 0 | [321, 385] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231224_230810__717.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 846 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_212624__329 | 0 | 0.0 | 18.5575 | 0 | [321, 766] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231226_212624__329.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 847 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_233201__833 | 0 | 0.0 | 25.6671 | 0 | [11, 674] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231213_233201__833.json | 25.0 | missing | missing | missing | |
| 848 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_230900__549 | 0 | 0.0 | 8.13601 | 0 | [405, 367] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231224_230900__549.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 849 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231224_230912__225 | 0 | 0.0 | 11.3215 | 0 | [405, 522] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231224_230912__225.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 850 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_212643__409 | 0 | 0.0 | 6.62203 | 0 | [405, 290] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231226_212643__409.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 851 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_233134__861 | 0 | 0.0 | 21.8603 | 0 | [412, 490] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231213_233134__861.json | 50.0 | missing | missing | missing | |
| 852 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_230844__933 | 0 | 0.0 | 7.24821 | 0 | [403, 322] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231224_230844__933.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 853 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231224_230852__414 | 0 | 0.0 | 7.95047 | 0 | [403, 358] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231224_230852__414.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 854 | Apple-MacBook-Pro-M1 | audi_filter | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_212637__885 | 0 | 0.0 | 7.48784 | 0 | [403, 335] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231226_212637__885.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 855 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | InJulia | 1SHOT | true | false | 5 | 20231213_232142__371 | 0 | 0.0 | 17.2403 | 0 | [112, 494] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__InJulia__1SHOT__20231213_232142__371.json | 25.0 | missing | missing | missing | |
| 856 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231224_224702__570 | 0 | 0.0 | 10.3447 | 0 | [119, 322] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__InJulia__1SHOT__20231224_224702__570.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 857 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231224_224713__996 | 0 | 0.0 | 11.1954 | 0 | [119, 351] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__InJulia__1SHOT__20231224_224713__996.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 858 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231226_211838__359 | 0 | 0.0 | 6.86297 | 0 | [119, 210] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__InJulia__1SHOT__20231226_211838__359.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 859 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231213_232124__248 | 0 | 0.0 | 17.4209 | 0 | [141, 487] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231213_232124__248.json | 0.0 | missing | missing | missing | |
| 860 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_224643__370 | 0 | 0.0 | 11.9172 | 0 | [160, 369] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231224_224643__370.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 861 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_224651__372 | 0 | 0.0 | 7.36829 | 0 | [160, 218] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231224_224651__372.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 862 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_211831__297 | 0 | 0.0 | 21.8422 | 0 | [160, 681] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231226_211831__297.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 863 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231213_232107__914 | 0 | 0.0 | 20.8324 | 0 | [311, 509] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231213_232107__914.json | 25.0 | missing | missing | missing | |
| 864 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231224_224617__476 | 0 | 0.0 | 30.7815 | 0 | [340, 735] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231224_224617__476.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 865 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_224631__187 | 0 | 0.0 | 14.0719 | 0 | [340, 400] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231224_224631__187.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 866 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_211809__923 | 0 | 0.0 | 22.2716 | 0 | [340, 495] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231226_211809__923.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 867 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_232239__130 | 0 | 0.0 | 20.3745 | 0 | [11, 544] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231213_232239__130.json | 50.0 | missing | missing | missing | |
| 868 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231224_224817__594 | 0 | 0.0 | 16.7572 | 0 | [427, 463] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231224_224817__594.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 869 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_224838__377 | 0 | 0.0 | 20.8225 | 0 | [427, 587] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231224_224838__377.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 870 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_211911__651 | 0 | 0.0 | 13.5888 | 0 | [427, 368] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231226_211911__651.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 871 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaRecapTask | 1SHOT | true | false | 5 | 20231213_232219__385 | 0 | 0.0 | 27.7508 | 0 | [412, 637] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231213_232219__385.json | 25.0 | missing | missing | missing | |
| 872 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_224745__587 | 0 | 0.0 | 13.4577 | 0 | [425, 363] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231224_224745__587.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 873 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_224800__824 | 0 | 0.0 | 14.5616 | 0 | [425, 397] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231224_224800__824.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 874 | Apple-MacBook-Pro-M1 | audi_filter | starling-lm:latest | JuliaRecapTask | 1SHOT | true | false | 5 | 20231226_211858__692 | 0 | 0.0 | 19.481 | 0 | [425, 548] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231226_211858__692.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 875 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231213_232326__504 | 0 | 0.0 | 20.5142 | 0 | [112, 584] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__InJulia__1SHOT__20231213_232326__504.json | 25.0 | missing | missing | missing | |
| 876 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231224_225221__793 | 0 | 0.0 | 37.5888 | 0 | [116, 273] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__InJulia__1SHOT__20231224_225221__793.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 877 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231224_225301__996 | 0 | 0.0 | 39.6453 | 0 | [116, 289] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__InJulia__1SHOT__20231224_225301__996.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 878 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231226_212155__973 | 0 | 0.0 | 49.9478 | 0 | [116, 368] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__InJulia__1SHOT__20231226_212155__973.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 879 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231213_232305__263 | 0 | 0.0 | 10.2219 | 0 | [141, 282] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231213_232305__263.json | 0.0 | missing | missing | missing | |
| 880 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_225120__521 | 0 | 0.0 | 33.8258 | 1 | [155, 238] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231224_225120__521.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 881 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_225143__501 | 0 | 0.0 | 23.3089 | 0 | [155, 157] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231224_225143__501.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 882 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_212105__910 | 0 | 0.0 | 30.023 | 0 | [155, 210] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231226_212105__910.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 883 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_232255__458 | 0 | 0.0 | 15.7184 | 0 | [311, 373] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231213_232255__458.json | 0.0 | missing | missing | missing | |
| 884 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231224_224932__141 | 0 | 0.0 | 54.1286 | 0 | [343, 164] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231224_224932__141.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 885 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231224_225045__451 | 0 | 0.0 | 72.7819 | 0 | [343, 490] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231224_225045__451.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 886 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_212035__514 | 0 | 0.0 | 82.9453 | 0 | [343, 404] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231226_212035__514.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 887 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_232434__585 | 0 | 0.0 | 17.3704 | 0 | [11, 468] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231213_232434__585.json | 25.0 | missing | missing | missing | |
| 888 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231224_225700__945 | 0 | 0.0 | 52.2443 | 0 | [429, 322] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231224_225700__945.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 889 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_225826__331 | 0 | 0.0 | 85.5014 | 0 | [429, 562] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231224_225826__331.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 890 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231226_212329__461 | 0 | 0.0 | 41.3752 | 0 | [429, 243] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231226_212329__461.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 891 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_232416__618 | 0 | 0.0 | 31.815 | 0 | [412, 733] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231213_232416__618.json | 0.0 | missing | missing | missing | |
| 892 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231224_225518__574 | 0 | 0.0 | 62.5352 | 0 | [427, 397] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231224_225518__574.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 893 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231224_225608__381 | 0 | 0.0 | 49.3816 | 0 | [427, 301] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231224_225608__381.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 894 | Apple-MacBook-Pro-M1 | audi_filter | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_212247__769 | 0 | 0.0 | 52.0133 | 0 | [427, 322] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/audi_filter/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231226_212247__769.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 895 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | InJulia | 1SHOT | false | false | 5 | 20231213_234202__292 | 0 | 0.0 | 14.4885 | 0 | [60, 435] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231213_234202__292.json | 0.0 | missing | missing | missing | |
| 896 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_000132__115 | 0 | 0.0 | 13.9409 | 0 | [68, 251] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_000132__115.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 897 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_000155__638 | 0 | 0.0 | 23.0082 | 0 | [68, 420] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_000155__638.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 898 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231226_220105__496 | 0 | 0.0 | 12.926 | 0 | [68, 233] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231226_220105__496.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 899 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_234148__621 | 0 | 0.0 | 6.81969 | 0 | [90, 196] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231213_234148__621.json | 50.0 | missing | missing | missing | |
| 900 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_000114__853 | 0 | 0.0 | 7.30339 | 0 | [107, 120] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_000114__853.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 901 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_000118__488 | 0 | 0.0 | 3.53673 | 0 | [107, 48] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_000118__488.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 902 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_220052__882 | 0 | 0.0 | 3.35666 | 0 | [107, 45] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231226_220052__882.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 903 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_234141__159 | 0 | 0.0 | 14.3533 | 0 | [201, 379] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231213_234141__159.json | 50.0 | missing | missing | missing | |
| 904 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_000058__606 | 0 | 0.0 | 14.7881 | 0 | [219, 57] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_000058__606.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 905 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_000107__340 | 0 | 0.0 | 8.9092 | 0 | [219, 134] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_000107__340.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 906 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_220048__688 | 0 | 0.0 | 12.8722 | 0 | [219, 33] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231226_220048__688.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 907 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_234257__422 | 0 | 0.0 | 17.814 | 0 | [11, 487] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231213_234257__422.json | 25.0 | missing | missing | missing | |
| 908 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_000257__381 | 0 | 0.0 | 8.87821 | 0 | [372, 107] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_000257__381.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 909 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_000318__396 | 0 | 0.0 | 21.0202 | 0 | [372, 325] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_000318__396.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 910 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_220138__485 | 0 | 0.0 | 11.7943 | 0 | [372, 161] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231226_220138__485.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 911 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | false | 5 | 20231213_234239__749 | 0 | 0.0 | 22.5905 | 0 | [361, 531] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231213_234239__749.json | 25.0 | missing | missing | missing | |
| 912 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_000229__766 | 0 | 0.0 | 26.1696 | 0 | [369, 416] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_000229__766.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 913 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_000248__177 | 0 | 0.0 | 19.1194 | 0 | [369, 291] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_000248__177.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 914 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | false | 5 | 20231226_220126__377 | 0 | 0.0 | 21.4502 | 0 | [369, 335] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231226_220126__377.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 915 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | InJulia | 1SHOT | true | false | 5 | 20231213_234344__984 | 0 | 0.0 | 13.6866 | 0 | [60, 412] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__InJulia__1SHOT__20231213_234344__984.json | 25.0 | missing | missing | missing | |
| 916 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_000411__226 | 0 | 0.0 | 29.6418 | 0 | [42, 546] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_000411__226.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 917 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_000443__300 | 0 | 0.0 | 31.8764 | 0 | [42, 586] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_000443__300.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 918 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_234330__817 | 0 | 0.0 | 15.908 | 0 | [90, 464] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231213_234330__817.json | 50.0 | missing | missing | missing | |
| 919 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_000336__692 | 0 | 0.0 | 4.16003 | 0 | [44, 70] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_000336__692.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 920 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_000342__555 | 0 | 0.0 | 5.93173 | 0 | [44, 104] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_000342__555.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 921 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_234314__207 | 0 | 0.0 | 17.442 | 0 | [201, 464] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231213_234314__207.json | 50.0 | missing | missing | missing | |
| 922 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_000331__819 | 0 | 0.0 | 0.877273 | 0 | [94, 1] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_000331__819.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 923 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_000331__991 | 0 | 0.0 | 12.5777 | 0 | [94, 30] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_000331__991.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 924 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_234443__685 | 1 | 0.0 | 19.5067 | 1 | [11, 530] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231213_234443__685.json | 67.5 | missing | missing | missing | |
| 925 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_000512__617 | 0 | 0.0 | 2.31192 | 0 | [61, 34] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_000512__617.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 926 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_000515__591 | 0 | 0.0 | 2.8428 | 0 | [61, 44] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_000515__591.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 927 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_234424__549 | 0 | 0.0 | 21.3398 | 0 | [361, 500] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231213_234424__549.json | 50.0 | missing | missing | missing | |
| 928 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_000503__706 | 0 | 0.0 | 4.32742 | 0 | [58, 73] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_000503__706.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 929 | Apple-MacBook-Pro-M1 | count_model_rows | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_000510__526 | 0 | 0.0 | 6.56486 | 0 | [58, 116] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_000510__526.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 930 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_214833__412 | 0 | 0.0 | 11.0637 | 0 | [1, 348] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_214833__412.json | 0.0 | missing | missing | missing | |
| 931 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_214848__877 | 0 | 0.0 | 15.1402 | 0 | [1, 467] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_214848__877.json | 0.0 | missing | missing | missing | |
| 932 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_002425__452 | 1 | 0.0 | 60.5936 | 2 | [60, 370] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_002425__452.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 933 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_002515__774 | 0 | 0.0 | 49.5821 | 0 | [60, 302] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_002515__774.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 934 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_221124__986 | 5 | 0.0 | 27.0139 | 2 | [60, 162] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231226_221124__986.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 935 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_214807__672 | 0 | 0.0 | 7.94208 | 0 | [1, 251] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_214807__672.json | 25.0 | missing | missing | missing | |
| 936 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_214811__771 | 0 | 0.0 | 4.51061 | 0 | [1, 145] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_214811__771.json | 25.0 | missing | missing | missing | |
| 937 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_002233__917 | 5 | 0.0 | 25.7016 | 2 | [101, 143] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_002233__917.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 938 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_002324__868 | 0 | 0.0 | 50.9779 | 0 | [101, 301] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_002324__868.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 939 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_221057__851 | 0 | 0.0 | 22.7213 | 0 | [101, 125] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_221057__851.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 940 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_214737__548 | 0 | 0.0 | 13.3937 | 0 | [1, 397] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_214737__548.json | 25.0 | missing | missing | missing | |
| 941 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_214750__193 | 0 | 0.0 | 12.8179 | 0 | [1, 381] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_214750__193.json | 0.0 | missing | missing | missing | |
| 942 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_002125__533 | 1 | 0.0 | 64.2227 | 2 | [213, 193] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_002125__533.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 943 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_002207__794 | 0 | 0.0 | 41.3104 | 0 | [213, 223] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_002207__794.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 944 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_221034__277 | 5 | 0.0 | 73.8019 | 2 | [213, 276] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_221034__277.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 945 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_215104__861 | 0 | 0.0 | 14.61 | 0 | [1, 411] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_215104__861.json | 25.0 | missing | missing | missing | |
| 946 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_215117__608 | 0 | 0.0 | 12.8198 | 0 | [1, 363] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_215117__608.json | 25.0 | missing | missing | missing | |
| 947 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_003016__320 | 1 | 0.0 | 69.3058 | 2 | [389, 361] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_003016__320.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 948 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_003052__750 | 0 | 0.0 | 35.7267 | 0 | [389, 159] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_003052__750.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 949 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_221302__557 | 5 | 0.0 | 33.0222 | 2 | [389, 143] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_221302__557.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 950 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_215014__120 | 0 | 0.0 | 21.3964 | 0 | [1, 586] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_215014__120.json | 25.0 | missing | missing | missing | |
| 951 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_215033__129 | 0 | 0.0 | 19.2519 | 0 | [1, 532] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_215033__129.json | 25.0 | missing | missing | missing | |
| 952 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_002808__553 | 5 | 0.0 | 64.2909 | 2 | [387, 331] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_002808__553.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 953 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_002906__811 | 5 | 0.0 | 57.9343 | 2 | [387, 293] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_002906__811.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 954 | Apple-MacBook-Pro-M1 | count_model_rows | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_221229__169 | 5 | 0.0 | 64.4366 | 2 | [387, 333] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_221229__169.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 955 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 5 | 20231226_221918__154 | 0 | 0.0 | 9.5799 | 0 | [60, 373] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231226_221918__154.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 956 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_102959__396 | 0 | 0.0 | 9.22674 | 0 | [60, 357] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_102959__396.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 957 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_103007__185 | 0 | 0.0 | 7.45885 | 0 | [60, 288] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_103007__185.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 958 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_103016__930 | 0 | 0.0 | 8.73325 | 0 | [60, 339] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_103016__930.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 959 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_221908__410 | 0 | 0.0 | 7.92727 | 0 | [97, 301] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_221908__410.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 960 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_102939__892 | 0 | 0.0 | 6.06691 | 0 | [97, 228] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_102939__892.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 961 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_102945__552 | 0 | 0.0 | 6.22573 | 0 | [97, 233] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_102945__552.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 962 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_102950__993 | 0 | 0.0 | 4.74458 | 0 | [97, 174] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_102950__993.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 963 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_221900__726 | 0 | 0.0 | 8.98386 | 0 | [205, 196] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_221900__726.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 964 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_102924__102 | 0 | 0.0 | 7.11626 | 0 | [205, 126] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_102924__102.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 965 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_102927__358 | 0 | 0.0 | 2.96017 | 0 | [205, 91] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_102927__358.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 966 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_102933__484 | 0 | 0.0 | 5.47301 | 0 | [205, 187] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_102933__484.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 967 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_221935__966 | 0 | 0.0 | 8.34377 | 0 | [349, 273] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_221935__966.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 968 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_103050__519 | 0 | 0.0 | 8.17975 | 0 | [349, 261] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_103050__519.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 969 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_103103__757 | 0 | 0.0 | 12.0694 | 0 | [349, 406] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_103103__757.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 970 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_103112__612 | 0 | 0.0 | 9.86214 | 0 | [349, 326] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_103112__612.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 971 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231226_221927__714 | 0 | 0.0 | 8.80065 | 0 | [346, 290] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_221927__714.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 972 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_103027__938 | 0 | 0.0 | 11.6535 | 0 | [346, 389] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_103027__938.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 973 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_103033__230 | 0 | 0.0 | 6.06742 | 0 | [346, 188] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_103033__230.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 974 | Apple-MacBook-Pro-M1 | count_model_rows | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_103042__592 | 1 | 0.0 | 8.8458 | 2 | [346, 289] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_103042__592.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 975 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | InJulia | 1SHOT | false | false | 5 | 20231213_233618__248 | 0 | 0.0 | 13.2318 | 0 | [60, 399] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__InJulia__1SHOT__20231213_233618__248.json | 0.0 | missing | missing | missing | |
| 976 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | InJulia | 1SHOT | false | false | 5 | 20231224_234455__228 | 0 | 0.0 | 13.1699 | 0 | [60, 398] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__InJulia__1SHOT__20231224_234455__228.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 977 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | InJulia | 1SHOT | false | false | 5 | 20231224_234506__267 | 0 | 0.0 | 10.9498 | 0 | [1, 344] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__InJulia__1SHOT__20231224_234506__267.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 978 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | InJulia | 1SHOT | true | false | 5 | 20231226_215449__134 | 0 | 0.0 | 8.04132 | 0 | [60, 248] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__InJulia__1SHOT__20231226_215449__134.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 979 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_233605__930 | 0 | 0.0 | 7.85529 | 0 | [90, 229] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaExpertAsk__1SHOT__20231213_233605__930.json | 50.0 | missing | missing | missing | |
| 980 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_234436__407 | 0 | 0.0 | 7.82815 | 0 | [90, 229] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaExpertAsk__1SHOT__20231224_234436__407.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 981 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_234442__153 | 0 | 0.0 | 5.74822 | 0 | [1, 183] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaExpertAsk__1SHOT__20231224_234442__153.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 982 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231226_215441__596 | 0 | 0.0 | 9.61107 | 0 | [90, 286] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaExpertAsk__1SHOT__20231226_215441__596.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 983 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_233557__202 | 0 | 0.0 | 16.5371 | 0 | [201, 438] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231213_233557__202.json | 50.0 | missing | missing | missing | |
| 984 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231224_234411__984 | 0 | 0.0 | 21.9894 | 2 | [219, 446] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231224_234411__984.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 985 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231224_234428__372 | 0 | 0.0 | 16.8116 | 0 | [1, 489] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231224_234428__372.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 986 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_215431__783 | 0 | 0.0 | 15.2671 | 0 | [219, 270] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_215431__783.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 987 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_233715__967 | 0 | 0.0 | 23.961 | 0 | [11, 642] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231213_233715__967.json | 25.0 | missing | missing | missing | |
| 988 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231224_234629__220 | 0 | 0.0 | 19.5344 | 0 | [11, 532] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231224_234629__220.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 989 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_234645__778 | 0 | 0.0 | 15.9547 | 0 | [1, 445] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231224_234645__778.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 990 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_215527__136 | 0 | 0.0 | 15.9301 | 0 | [11, 445] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_215527__136.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 991 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231213_233651__679 | 0 | 0.0 | 19.898 | 0 | [361, 462] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaRecapTask__1SHOT__20231213_233651__679.json | 25.0 | missing | missing | missing | |
| 992 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231224_234555__380 | 0 | 0.0 | 19.3989 | 0 | [361, 450] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaRecapTask__1SHOT__20231224_234555__380.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 993 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231224_234610__411 | 0 | 0.0 | 15.2164 | 0 | [1, 426] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaRecapTask__1SHOT__20231224_234610__411.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 994 | Apple-MacBook-Pro-M1 | count_model_rows | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_215511__173 | 0 | 0.0 | 21.8057 | 2 | [361, 520] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/llama2/evaluation__JuliaRecapTask__1SHOT__20231226_215511__173.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 995 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | InJulia | 1SHOT | true | false | 5 | 20231213_234528__334 | 0 | 0.0 | 22.1687 | 0 | [60, 642] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__InJulia__1SHOT__20231213_234528__334.json | 25.0 | missing | missing | missing | |
| 996 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_000558__910 | 0 | 0.0 | 8.12041 | 0 | [60, 267] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__InJulia__1SHOT__20231225_000558__910.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 997 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | InJulia | 1SHOT | true | false | 5 | 20231225_000606__669 | 0 | 0.0 | 8.19033 | 0 | [60, 270] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__InJulia__1SHOT__20231225_000606__669.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 998 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | InJulia | 1SHOT | true | true | 5 | 20231226_220208__595 | 0 | 0.0 | 5.94434 | 0 | [60, 194] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__InJulia__1SHOT__20231226_220208__595.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 999 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_234505__551 | 0 | 0.0 | 6.46371 | 0 | [90, 186] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231213_234505__551.json | 50.0 | missing | missing | missing | |
| 1000 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_000541__141 | 1 | 0.0 | 6.13097 | 2 | [100, 191] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_000541__141.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1001 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_000550__445 | 0 | 0.0 | 8.80394 | 0 | [100, 281] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_000550__445.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1002 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_220202__922 | 0 | 0.0 | 8.75688 | 0 | [100, 280] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231226_220202__922.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1003 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_234459__575 | 0 | 0.0 | 15.2193 | 0 | [201, 403] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231213_234459__575.json | 0.0 | missing | missing | missing | |
| 1004 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_000528__451 | 0 | 0.0 | 12.983 | 0 | [211, 204] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_000528__451.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1005 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_000534__850 | 0 | 0.0 | 6.62346 | 0 | [211, 191] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_000534__850.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1006 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_220153__362 | 0 | 0.0 | 14.932 | 0 | [211, 276] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231226_220153__362.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1007 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_234627__972 | 0 | 0.0 | 24.8557 | 0 | [11, 665] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231213_234627__972.json | 50.0 | missing | missing | missing | |
| 1008 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_000642__879 | 0 | 0.0 | 10.3014 | 0 | [364, 283] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_000642__879.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1009 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_000649__625 | 0 | 0.0 | 6.71047 | 0 | [364, 167] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_000649__625.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1010 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_220223__537 | 5 | 0.0 | 8.25833 | 2 | [364, 218] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231226_220223__537.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1011 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_234602__937 | 0 | 0.0 | 20.9137 | 0 | [361, 489] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaRecapTask__1SHOT__20231213_234602__937.json | 0.0 | missing | missing | missing | |
| 1012 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_000622__251 | 0 | 0.0 | 7.673 | 0 | [361, 198] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_000622__251.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1013 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_000631__385 | 0 | 0.0 | 8.94708 | 0 | [361, 240] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_000631__385.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1014 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_220215__781 | 5 | 0.0 | 6.85407 | 2 | [361, 173] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder/evaluation__JuliaRecapTask__1SHOT__20231226_220215__781.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1015 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_175048__165 | 0 | 0.0 | 8.9984 | 0 | [60, 174] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_175048__165.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1016 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_175057__429 | 5 | 0.0 | 8.3959 | 2 | [60, 162] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_175057__429.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1017 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_175116__348 | 0 | 0.0 | 19.3707 | 0 | [60, 380] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_175116__348.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1018 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_175014__300 | 0 | 0.0 | 9.92116 | 0 | [100, 186] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_175014__300.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1019 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_175029__470 | 0 | 0.0 | 14.7704 | 0 | [100, 282] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_175029__470.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1020 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_175039__302 | 0 | 0.0 | 10.4279 | 0 | [100, 196] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_175039__302.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1021 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_174934__857 | 0 | 0.0 | 22.0824 | 0 | [211, 393] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_174934__857.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1022 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_174946__637 | 0 | 0.0 | 12.2047 | 0 | [211, 191] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_174946__637.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1023 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_175004__813 | 0 | 0.0 | 17.5841 | 0 | [211, 324] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_175004__813.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1024 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_175215__667 | 0 | 0.0 | 16.5031 | 0 | [364, 283] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_175215__667.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1025 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_175230__935 | 0 | 0.0 | 15.201 | 0 | [364, 258] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_175230__935.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1026 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_175245__183 | 0 | 0.0 | 14.4849 | 0 | [364, 244] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_175245__183.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1027 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_175132__706 | 0 | 0.0 | 15.204 | 0 | [361, 258] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_175132__706.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1028 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_175143__130 | 1 | 0.0 | 11.2641 | 2 | [361, 182] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_175143__130.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1029 | Apple-MacBook-Pro-M1 | count_model_rows | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_175158__941 | 0 | 0.0 | 15.0039 | 0 | [361, 254] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_175158__941.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1030 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_215710__587 | 0 | 0.0 | 11.7271 | 0 | [1, 368] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_215710__587.json | 25.0 | missing | missing | missing | |
| 1031 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_215724__734 | 0 | 0.0 | 13.4437 | 0 | [1, 418] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_215724__734.json | 25.0 | missing | missing | missing | |
| 1032 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231225_003514__205 | 0 | 0.0 | 13.5121 | 0 | [56, 343] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_003514__205.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1033 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_003519__296 | 0 | 0.0 | 5.21206 | 0 | [56, 128] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_003519__296.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1034 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231226_221455__488 | 0 | 0.0 | 5.9337 | 0 | [56, 147] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231226_221455__488.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1035 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_215620__882 | 0 | 0.0 | 6.5107 | 0 | [1, 207] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_215620__882.json | 25.0 | missing | missing | missing | |
| 1036 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_215638__188 | 0 | 0.0 | 18.4253 | 0 | [1, 553] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_215638__188.json | 25.0 | missing | missing | missing | |
| 1037 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_003458__681 | 0 | 0.0 | 3.47413 | 0 | [98, 74] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_003458__681.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1038 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_003501__646 | 0 | 0.0 | 2.23043 | 0 | [98, 41] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_003501__646.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1039 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_221449__501 | 0 | 0.0 | 2.80355 | 0 | [98, 56] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_221449__501.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1040 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_215550__734 | 0 | 0.0 | 8.98351 | 0 | [1, 272] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_215550__734.json | 25.0 | missing | missing | missing | |
| 1041 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_215603__210 | 0 | 0.0 | 12.8947 | 0 | [1, 383] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_215603__210.json | 25.0 | missing | missing | missing | |
| 1042 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_003442__929 | 0 | 0.0 | 15.839 | 0 | [209, 230] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_003442__929.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1043 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_003455__232 | 0 | 0.0 | 13.1729 | 0 | [209, 309] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_003455__232.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1044 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_221446__636 | 0 | 0.0 | 10.7512 | 0 | [209, 110] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_221446__636.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1045 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_215931__417 | 0 | 0.0 | 19.054 | 0 | [1, 526] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_215931__417.json | 25.0 | missing | missing | missing | |
| 1046 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_215942__856 | 0 | 0.0 | 11.1712 | 0 | [1, 319] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_215942__856.json | 25.0 | missing | missing | missing | |
| 1047 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_003600__981 | 0 | 0.0 | 11.5132 | 0 | [365, 242] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_003600__981.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1048 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_003610__916 | 1 | 0.0 | 9.9214 | 1 | [365, 202] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_003610__916.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1049 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_221537__619 | 0 | 0.0 | 20.1622 | 0 | [365, 455] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_221537__619.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1050 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_215838__369 | 0 | 0.0 | 15.346 | 0 | [1, 431] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_215838__369.json | 25.0 | missing | missing | missing | |
| 1051 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_215855__492 | 0 | 0.0 | 17.3169 | 0 | [1, 482] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_215855__492.json | 25.0 | missing | missing | missing | |
| 1052 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_003537__791 | 1 | 0.0 | 9.64523 | 1 | [363, 195] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_003537__791.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1053 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_003549__328 | 0 | 0.0 | 10.8939 | 0 | [363, 226] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_003549__328.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1054 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231226_221516__380 | 0 | 0.0 | 21.4807 | 0 | [363, 487] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_221516__380.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1055 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_224733__816 | 0 | 0.0 | 10.4385 | 0 | [55, 335] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_224733__816.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1056 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | false | false | 5 | 20231227_224746__855 | 0 | 0.0 | 12.2269 | 0 | [55, 392] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_224746__855.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1057 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_224759__931 | 0 | 0.0 | 13.6106 | 0 | [55, 436] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_224759__931.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1058 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_224816__762 | 0 | 0.0 | 16.7234 | 0 | [55, 535] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_224816__762.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1059 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_224823__819 | 0 | 0.0 | 6.50545 | 0 | [55, 206] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_224823__819.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1060 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_224709__317 | 0 | 0.0 | 2.33556 | 0 | [97, 58] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_224709__317.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1061 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_224712__222 | 1 | 0.0 | 2.20618 | 1 | [97, 54] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_224712__222.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1062 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_224714__157 | 0 | 0.0 | 2.29744 | 0 | [97, 56] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_224714__157.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1063 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_224720__725 | 0 | 0.0 | 5.94903 | 0 | [97, 177] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_224720__725.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1064 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_224723__348 | 0 | 0.0 | 2.21717 | 0 | [97, 54] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_224723__348.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1065 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_224621__198 | 0 | 0.0 | 14.9847 | 2 | [208, 416] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_224621__198.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1066 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_224634__966 | 0 | 0.0 | 13.107 | 0 | [208, 390] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_224634__966.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1067 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_224649__826 | 0 | 0.0 | 15.097 | 2 | [208, 452] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_224649__826.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1068 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_224659__531 | 0 | 0.0 | 9.97539 | 0 | [208, 291] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_224659__531.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1069 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_224707__291 | 0 | 0.0 | 7.68919 | 0 | [208, 218] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_224707__291.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1070 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_224955__265 | 0 | 0.0 | 14.458 | 0 | [364, 401] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_224955__265.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1071 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_225012__671 | 0 | 0.0 | 16.3343 | 0 | [364, 458] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_225012__671.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1072 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_225024__360 | 0 | 0.0 | 12.2992 | 0 | [364, 334] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_225024__360.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1073 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_225041__117 | 0 | 0.0 | 16.6822 | 2 | [364, 469] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_225041__117.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1074 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_225056__555 | 0 | 0.0 | 15.5444 | 0 | [364, 434] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_225056__555.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1075 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_224839__202 | 1 | 0.0 | 15.9437 | 2 | [362, 447] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_224839__202.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1076 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_224858__743 | 0 | 0.0 | 18.4795 | 0 | [362, 522] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_224858__743.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1077 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_224910__688 | 0 | 0.0 | 11.8807 | 0 | [362, 322] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_224910__688.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1078 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_224924__452 | 0 | 0.0 | 14.1274 | 1 | [362, 391] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_224924__452.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1079 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_224941__486 | 0 | 0.0 | 15.7683 | 0 | [362, 441] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_224941__486.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1080 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_225244__808 | 0 | 0.0 | 10.8758 | 0 | [55, 274] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_225244__808.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1081 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_225303__729 | 0 | 0.0 | 19.3672 | 0 | [55, 490] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_225303__729.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1082 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_225319__405 | 1 | 0.0 | 15.6836 | 1 | [55, 397] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_225319__405.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1083 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_225337__693 | 1 | 0.0 | 17.4994 | 1 | [55, 443] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_225337__693.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1084 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_225355__678 | 0 | 0.0 | 18.3585 | 0 | [55, 465] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_225355__678.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1085 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_225215__824 | 0 | 0.0 | 6.02155 | 0 | [97, 140] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_225215__824.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1086 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_225221__304 | 0 | 0.0 | 5.92214 | 0 | [97, 137] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_225221__304.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1087 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_225226__473 | 0 | 0.0 | 4.16356 | 0 | [97, 91] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_225226__473.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1088 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_225229__854 | 0 | 0.0 | 3.30812 | 0 | [97, 69] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_225229__854.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1089 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_225233__252 | 0 | 0.0 | 3.45889 | 0 | [97, 73] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_225233__252.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1090 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_225110__628 | 1 | 0.0 | 13.1033 | 1 | [208, 285] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_225110__628.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1091 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_225123__779 | 0 | 0.0 | 13.5967 | 2 | [208, 318] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_225123__779.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1092 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_225135__782 | 0 | 0.0 | 11.7692 | 0 | [208, 272] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_225135__782.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1093 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_225153__940 | 0 | 0.0 | 18.3907 | 0 | [208, 438] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_225153__940.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1094 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_225209__972 | 0 | 0.0 | 15.6115 | 0 | [208, 369] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_225209__972.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1095 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_225546__402 | 0 | 0.0 | 10.5431 | 0 | [364, 216] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_225546__402.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1096 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_225608__907 | 1 | 0.0 | 21.2866 | 1 | [364, 480] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_225608__907.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1097 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_225625__904 | 0 | 0.0 | 16.9681 | 0 | [364, 375] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_225625__904.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1098 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_225636__970 | 0 | 0.0 | 11.8515 | 0 | [364, 249] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_225636__970.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1099 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_225658__673 | 0 | 0.0 | 20.826 | 0 | [364, 469] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_225658__673.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1100 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_225415__375 | 0 | 0.0 | 19.505 | 0 | [362, 437] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_225415__375.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1101 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_225428__884 | 0 | 0.0 | 13.1764 | 0 | [362, 282] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_225428__884.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1102 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_225445__234 | 0 | 0.0 | 16.2774 | 0 | [362, 358] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_225445__234.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1103 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_225509__986 | 0 | 0.0 | 24.4287 | 0 | [362, 556] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_225509__986.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1104 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_225536__595 | 0 | 0.0 | 26.055 | 0 | [362, 595] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_225536__595.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1105 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_120244__330 | 0 | 0.0 | 15.4474 | 0 | [55, 284] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_120244__330.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1106 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_120301__205 | 0 | 0.0 | 16.591 | 0 | [55, 305] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_120301__205.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1107 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231226_221745__734 | 0 | 0.0 | 13.1733 | 0 | [55, 243] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_221745__734.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1108 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_120223__300 | 0 | 0.0 | 10.3157 | 0 | [97, 181] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_120223__300.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1109 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_120228__757 | 0 | 0.0 | 5.80172 | 0 | [97, 96] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_120228__757.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1110 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_221732__259 | 0 | 0.0 | 4.85736 | 0 | [97, 79] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_221732__259.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1111 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_120152__580 | 0 | 0.0 | 20.9227 | 0 | [208, 365] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_120152__580.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1112 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_120212__478 | 0 | 0.0 | 19.9863 | 2 | [208, 348] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_120212__478.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1113 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_221727__446 | 1 | 0.0 | 25.8075 | 1 | [208, 293] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_221727__446.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1114 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_120435__893 | 0 | 0.0 | 22.4704 | 0 | [364, 372] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_120435__893.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1115 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_120503__239 | 1 | 0.0 | 28.1081 | 1 | [364, 473] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_120503__239.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1116 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_221851__107 | 0 | 0.0 | 28.1158 | 0 | [364, 477] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_221851__107.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1117 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_120351__688 | 0 | 0.0 | 22.518 | 0 | [362, 373] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_120351__688.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1118 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_120412__206 | 1 | 0.0 | 21.5563 | 1 | [362, 356] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_120412__206.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1119 | Apple-MacBook-Pro-M1 | count_model_rows | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_221823__917 | 1 | 0.0 | 38.0978 | 1 | [362, 655] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_221823__917.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1120 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_103555__565 | 1 | 0.0 | 41.5998 | 1 | [61, 241] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_103555__565.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1121 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_103621__851 | 0 | 0.0 | 25.9109 | 0 | [61, 145] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_103621__851.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1122 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_103712__361 | 0 | 0.0 | 49.9946 | 0 | [61, 276] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_103712__361.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1123 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_142911__277 | 0 | 0.0 | 31.6657 | 0 | [61, 186] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_142911__277.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1124 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_142956__185 | 0 | 0.0 | 44.484 | 0 | [61, 264] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_142956__185.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1125 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_103449__127 | 0 | 0.0 | 13.5708 | 0 | [100, 65] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_103449__127.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1126 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_103500__680 | 0 | 0.0 | 11.3195 | 0 | [100, 51] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_103500__680.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1127 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_103513__767 | 5 | 0.0 | 13.0491 | 2 | [100, 62] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_103513__767.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1128 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_142756__756 | 0 | 0.0 | 8.5501 | 0 | [100, 34] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_142756__756.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1129 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_142839__221 | 0 | 0.0 | 43.2631 | 1 | [100, 247] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_142839__221.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1130 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_103220__806 | 0 | 0.0 | 67.2272 | 0 | [211, 338] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_103220__806.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1131 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_103340__175 | 0 | 0.0 | 80.4441 | 0 | [211, 443] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_103340__175.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1132 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_103435__719 | 1 | 0.0 | 54.744 | 1 | [211, 299] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_103435__719.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1133 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_142731__384 | 0 | 0.0 | 59.987 | 0 | [211, 330] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_142731__384.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1134 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_142747__969 | 0 | 0.0 | 16.0041 | 0 | [211, 65] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_142747__969.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1135 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_104055__291 | 0 | 0.0 | 46.4848 | 0 | [374, 222] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_104055__291.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1136 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_104201__597 | 1 | 0.0 | 65.1742 | 1 | [374, 332] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_104201__597.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1137 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_104309__915 | 0 | 0.0 | 68.0833 | 1 | [374, 349] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_104309__915.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1138 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_143301__236 | 5 | 0.0 | 42.2901 | 2 | [374, 196] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_143301__236.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1139 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_143345__597 | 1 | 0.0 | 43.6346 | 1 | [374, 204] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_143345__597.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1140 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_103804__531 | 0 | 0.0 | 50.9949 | 0 | [372, 237] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_103804__531.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1141 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_103944__955 | 0 | 0.0 | 99.6307 | 0 | [372, 525] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_103944__955.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1142 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_104009__839 | 0 | 0.0 | 24.7735 | 0 | [372, 92] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_104009__839.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1143 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_143049__145 | 0 | 0.0 | 52.9837 | 0 | [372, 259] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_143049__145.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1144 | Apple-MacBook-Pro-M1 | count_model_rows | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_143219__477 | 0 | 0.0 | 89.837 | 0 | [372, 474] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_143219__477.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1145 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_220119__798 | 0 | 0.0 | 15.2935 | 0 | [1, 471] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_220119__798.json | 0.0 | missing | missing | missing | |
| 1146 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_220134__443 | 0 | 0.0 | 15.5754 | 0 | [1, 479] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_220134__443.json | 25.0 | missing | missing | missing | |
| 1147 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_003701__366 | 1 | 0.0 | 8.35118 | 1 | [64, 210] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_003701__366.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1148 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_003707__118 | 0 | 0.0 | 6.60528 | 0 | [64, 164] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_003707__118.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1149 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231226_221601__573 | 0 | 0.0 | 6.49109 | 0 | [64, 161] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231226_221601__573.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1150 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_220051__412 | 0 | 0.0 | 3.21347 | 0 | [1, 104] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_220051__412.json | 25.0 | missing | missing | missing | |
| 1151 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_220052__955 | 0 | 0.0 | 1.53799 | 0 | [1, 50] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_220052__955.json | 0.0 | missing | missing | missing | |
| 1152 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_003647__343 | 0 | 0.0 | 2.84394 | 0 | [106, 57] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_003647__343.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1153 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_003652__574 | 0 | 0.0 | 4.68083 | 0 | [106, 105] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_003652__574.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1154 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_221555__517 | 0 | 0.0 | 3.56213 | 0 | [106, 76] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_221555__517.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1155 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_220016__599 | 0 | 0.0 | 16.9062 | 0 | [1, 493] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_220016__599.json | 25.0 | missing | missing | missing | |
| 1156 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_220032__674 | 0 | 0.0 | 15.9453 | 0 | [1, 467] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_220032__674.json | 0.0 | missing | missing | missing | |
| 1157 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_003631__889 | 0 | 0.0 | 20.8787 | 0 | [217, 342] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_003631__889.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1158 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_003644__389 | 0 | 0.0 | 13.2527 | 0 | [217, 310] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_003644__389.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1159 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_221551__431 | 0 | 0.0 | 14.5061 | 0 | [217, 191] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_221551__431.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1160 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_220354__445 | 0 | 0.0 | 23.6206 | 0 | [1, 641] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_220354__445.json | 25.0 | missing | missing | missing | |
| 1161 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_220356__920 | 0 | 0.0 | 1.89288 | 0 | [1, 56] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_220356__920.json | 0.0 | missing | missing | missing | |
| 1162 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_003750__204 | 0 | 0.0 | 10.7489 | 0 | [373, 222] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_003750__204.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1163 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_003802__717 | 1 | 0.0 | 11.4122 | 1 | [373, 239] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_003802__717.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1164 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_221623__540 | 0 | 0.0 | 11.9498 | 0 | [373, 252] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_221623__540.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1165 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_220300__264 | 0 | 0.0 | 19.8496 | 0 | [1, 547] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_220300__264.json | 25.0 | missing | missing | missing | |
| 1166 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_220318__589 | 0 | 0.0 | 17.6938 | 0 | [1, 492] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_220318__589.json | 25.0 | missing | missing | missing | |
| 1167 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_003730__176 | 0 | 0.0 | 7.20025 | 0 | [371, 133] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_003730__176.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1168 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_003740__579 | 0 | 0.0 | 9.22224 | 0 | [371, 184] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_003740__579.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1169 | Apple-MacBook-Pro-M1 | count_model_rows | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_221611__745 | 0 | 0.0 | 9.22555 | 0 | [371, 184] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_221611__745.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1170 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 5 | 20231213_233751__850 | 0 | 0.0 | 13.8492 | 0 | [60, 416] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231213_233751__850.json | 25.0 | missing | missing | missing | |
| 1171 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231224_234738__461 | 1 | 0.0 | 5.86908 | 2 | [62, 188] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231224_234738__461.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1172 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231224_234750__672 | 0 | 0.0 | 11.0488 | 0 | [62, 359] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231224_234750__672.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1173 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231226_215558__283 | 0 | 0.0 | 9.13674 | 0 | [62, 293] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231226_215558__283.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1174 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_233737__854 | 0 | 0.0 | 5.21213 | 0 | [90, 148] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231213_233737__854.json | 50.0 | missing | missing | missing | |
| 1175 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231224_234726__349 | 0 | 0.0 | 8.27324 | 0 | [104, 257] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231224_234726__349.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1176 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_234732__537 | 0 | 0.0 | 6.50069 | 0 | [104, 199] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231224_234732__537.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1177 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_215549__585 | 0 | 0.0 | 3.75357 | 0 | [104, 107] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231226_215549__585.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1178 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_233731__334 | 0 | 0.0 | 16.0344 | 0 | [201, 425] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231213_233731__334.json | 50.0 | missing | missing | missing | |
| 1179 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231224_234701__993 | 0 | 0.0 | 15.8352 | 0 | [215, 310] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231224_234701__993.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1180 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231224_234717__729 | 1 | 0.0 | 15.7853 | 1 | [215, 480] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231224_234717__729.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1181 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_215545__766 | 0 | 0.0 | 18.6394 | 2 | [215, 403] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231226_215545__766.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1182 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_233838__799 | 0 | 0.0 | 16.4617 | 0 | [11, 452] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231213_233838__799.json | 0.0 | missing | missing | missing | |
| 1183 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_234827__593 | 0 | 0.0 | 8.45354 | 0 | [371, 219] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231224_234827__593.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1184 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231224_234841__322 | 1 | 0.0 | 13.8755 | 2 | [371, 389] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231224_234841__322.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1185 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_215626__275 | 0 | 0.0 | 6.39491 | 0 | [371, 153] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231226_215626__275.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1186 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_233821__466 | 0 | 0.0 | 16.9021 | 0 | [361, 383] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231213_233821__466.json | 50.0 | missing | missing | missing | |
| 1187 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_234809__996 | 1 | 0.0 | 10.5946 | 1 | [369, 286] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231224_234809__996.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1188 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_234818__902 | 0 | 0.0 | 9.20994 | 0 | [369, 243] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231224_234818__902.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1189 | Apple-MacBook-Pro-M1 | count_model_rows | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | false | 5 | 20231226_215620__454 | 0 | 0.0 | 21.1848 | 0 | [369, 610] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231226_215620__454.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1190 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | InJulia | 1SHOT | true | false | 5 | 20231213_234827__613 | 0 | 0.0 | 13.1533 | 0 | [60, 398] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__InJulia__1SHOT__20231213_234827__613.json | 25.0 | missing | missing | missing | |
| 1191 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_001000__774 | 0 | 0.0 | 18.8294 | 0 | [63, 346] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__InJulia__1SHOT__20231225_001000__774.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1192 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_001003__328 | 0 | 0.0 | 2.84786 | 0 | [63, 44] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__InJulia__1SHOT__20231225_001003__328.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1193 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231226_220408__605 | 0 | 0.0 | 24.4842 | 0 | [63, 451] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__InJulia__1SHOT__20231226_220408__605.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1194 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_234814__418 | 0 | 0.0 | 5.96785 | 0 | [90, 172] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231213_234814__418.json | 50.0 | missing | missing | missing | |
| 1195 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_000929__442 | 0 | 0.0 | 10.2484 | 0 | [103, 176] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_000929__442.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1196 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_000941__640 | 0 | 0.0 | 12.2004 | 0 | [103, 212] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_000941__640.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1197 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_220344__812 | 0 | 0.0 | 9.61763 | 0 | [103, 164] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231226_220344__812.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1198 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_234808__337 | 0 | 0.0 | 13.8162 | 0 | [201, 365] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231213_234808__337.json | 50.0 | missing | missing | missing | |
| 1199 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_000853__196 | 0 | 0.0 | 34.877 | 0 | [214, 429] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_000853__196.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1200 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_000918__218 | 0 | 0.0 | 25.4935 | 0 | [214, 435] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_000918__218.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1201 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_220334__988 | 0 | 0.0 | 37.6627 | 0 | [214, 494] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231226_220334__988.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1202 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_234916__427 | 0 | 0.0 | 14.678 | 0 | [11, 406] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231213_234916__427.json | 50.0 | missing | missing | missing | |
| 1203 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_001152__922 | 0 | 0.0 | 27.7576 | 0 | [367, 442] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_001152__922.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1204 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_001222__951 | 1 | 0.0 | 30.1904 | 1 | [367, 484] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_001222__951.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1205 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231226_220510__486 | 0 | 0.0 | 42.6368 | 0 | [367, 696] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231226_220510__486.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1206 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_234901__646 | 0 | 0.0 | 20.1089 | 0 | [361, 469] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231213_234901__646.json | 0.0 | missing | missing | missing | |
| 1207 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_001102__594 | 0 | 0.0 | 14.1329 | 0 | [364, 202] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_001102__594.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1208 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_001124__452 | 0 | 0.0 | 21.9011 | 0 | [364, 340] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_001124__452.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1209 | Apple-MacBook-Pro-M1 | count_model_rows | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_220427__666 | 0 | 0.0 | 19.4366 | 0 | [364, 298] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231226_220427__666.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1210 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | true | false | 5 | 20231219_220545__201 | 0 | 0.0 | 18.8485 | 0 | [1, 571] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_220545__201.json | 25.0 | missing | missing | missing | |
| 1211 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231219_220603__798 | 0 | 0.0 | 17.7743 | 0 | [1, 541] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_220603__798.json | 0.0 | missing | missing | missing | |
| 1212 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_003845__635 | 0 | 0.0 | 0.848594 | 0 | [54, 28] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_003845__635.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1213 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_003904__702 | 0 | 0.0 | 19.0177 | 0 | [54, 722] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_003904__702.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1214 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231226_221654__796 | 0 | 0.0 | 1.36434 | 0 | [54, 49] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231226_221654__796.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1215 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_220501__368 | 0 | 0.0 | 13.3985 | 0 | [1, 412] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_220501__368.json | 25.0 | missing | missing | missing | |
| 1216 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_220512__562 | 0 | 0.0 | 11.2781 | 0 | [1, 350] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_220512__562.json | 25.0 | missing | missing | missing | |
| 1217 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_003843__883 | 0 | 0.0 | 1.80896 | 0 | [91, 63] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_003843__883.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1218 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_003844__290 | 0 | 0.0 | 1.16057 | 0 | [91, 36] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_003844__290.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1219 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_221653__798 | 0 | 0.0 | 25.2228 | 0 | [91, 925] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_221653__798.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1220 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_220427__284 | 0 | 0.0 | 13.9989 | 0 | [1, 414] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_220427__284.json | 25.0 | missing | missing | missing | |
| 1221 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_220438__328 | 0 | 0.0 | 10.3601 | 0 | [1, 311] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_220438__328.json | 25.0 | missing | missing | missing | |
| 1222 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_003826__758 | 0 | 0.0 | 23.9141 | 0 | [199, 728] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_003826__758.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1223 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_003841__401 | 0 | 0.0 | 15.1561 | 0 | [199, 550] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_003841__401.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1224 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_221627__986 | 0 | 0.0 | 4.64606 | 0 | [199, 22] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_221627__986.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1225 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_220813__341 | 0 | 0.0 | 21.4744 | 0 | [1, 587] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_220813__341.json | 25.0 | missing | missing | missing | |
| 1226 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_220830__598 | 0 | 0.0 | 16.9847 | 0 | [1, 473] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_220830__598.json | 25.0 | missing | missing | missing | |
| 1227 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_004013__520 | 0 | 0.0 | 16.5171 | 0 | [343, 565] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_004013__520.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1228 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_004051__685 | 0 | 0.0 | 37.4903 | 0 | [343, 1243] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_004051__685.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1229 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_221701__304 | 0 | 0.0 | 1.77331 | 0 | [343, 26] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_221701__304.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1230 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_220716__118 | 0 | 0.0 | 17.038 | 0 | [1, 475] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_220716__118.json | 25.0 | missing | missing | missing | |
| 1231 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_220734__791 | 0 | 0.0 | 17.3187 | 0 | [1, 482] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_220734__791.json | 25.0 | missing | missing | missing | |
| 1232 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_003939__570 | 0 | 0.0 | 4.58815 | 0 | [340, 134] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_003939__570.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1233 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_003957__543 | 0 | 0.0 | 17.3381 | 0 | [340, 594] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_003957__543.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1234 | Apple-MacBook-Pro-M1 | count_model_rows | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_221659__760 | 0 | 0.0 | 5.00695 | 0 | [340, 150] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_221659__760.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1235 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231213_234951__780 | 0 | 0.0 | 14.1914 | 0 | [60, 428] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231213_234951__780.json | 50.0 | missing | missing | missing | |
| 1236 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | InJulia | 1SHOT | false | false | 5 | 20231225_001522__114 | 0 | 0.0 | 37.2775 | 0 | [71, 287] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_001522__114.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1237 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_001556__112 | 5 | 0.0 | 33.66 | 2 | [71, 258] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_001556__112.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1238 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | InJulia | 1SHOT | false | false | 5 | 20231226_220726__983 | 0 | 0.0 | 39.4345 | 0 | [71, 302] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231226_220726__983.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1239 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_234937__779 | 0 | 0.0 | 6.45624 | 0 | [90, 187] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231213_234937__779.json | 50.0 | missing | missing | missing | |
| 1240 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_001414__539 | 5 | 0.0 | 20.3467 | 2 | [111, 145] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_001414__539.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1241 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_001445__481 | 1 | 0.0 | 31.2445 | 2 | [111, 233] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_001445__481.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1242 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_220647__752 | 0 | 0.0 | 29.3822 | 0 | [111, 217] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231226_220647__752.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1243 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_234930__404 | 0 | 0.0 | 13.9218 | 0 | [201, 368] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231213_234930__404.json | 50.0 | missing | missing | missing | |
| 1244 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_001300__292 | 0 | 0.0 | 38.2029 | 0 | [222, 89] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_001300__292.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1245 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_001353__891 | 0 | 0.0 | 53.0855 | 0 | [222, 387] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_001353__891.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1246 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_220617__839 | 0 | 0.0 | 67.0474 | 0 | [222, 331] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_220617__839.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1247 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_235043__441 | 0 | 0.0 | 18.3364 | 0 | [11, 502] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231213_235043__441.json | 25.0 | missing | missing | missing | |
| 1248 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_001920__254 | 0 | 0.0 | 29.7647 | 0 | [375, 175] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_001920__254.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1249 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_002021__816 | 5 | 0.0 | 60.9093 | 2 | [375, 416] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_002021__816.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1250 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231226_220920__197 | 0 | 0.0 | 46.0586 | 0 | [375, 302] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_220920__197.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1251 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_235025__758 | 0 | 0.0 | 19.4897 | 1 | [361, 453] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231213_235025__758.json | 62.5 | missing | missing | missing | |
| 1252 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_001758__851 | 0 | 0.0 | 48.0835 | 0 | [372, 318] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_001758__851.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1253 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_001850__653 | 1 | 0.0 | 52.3759 | 2 | [372, 351] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_001850__653.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1254 | Apple-MacBook-Pro-M1 | count_model_rows | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_220834__609 | 5 | 0.0 | 67.6811 | 2 | [372, 468] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231226_220834__609.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1255 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_215250__842 | 0 | 0.0 | 13.6183 | 0 | [1, 423] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_215250__842.json | 25.0 | missing | missing | missing | |
| 1256 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_215304__791 | 0 | 0.0 | 14.3747 | 0 | [1, 445] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_215304__791.json | 0.0 | missing | missing | missing | |
| 1257 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231225_003223__891 | 0 | 0.0 | 14.5014 | 0 | [64, 247] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_003223__891.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1258 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_003241__830 | 0 | 0.0 | 17.4855 | 0 | [64, 299] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_003241__830.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1259 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_221404__411 | 1 | 0.0 | 18.7704 | 1 | [64, 321] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231226_221404__411.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1260 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_215214__725 | 0 | 0.0 | 6.50602 | 0 | [1, 207] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_215214__725.json | 25.0 | missing | missing | missing | |
| 1261 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_215224__781 | 0 | 0.0 | 9.68038 | 0 | [1, 303] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_215224__781.json | 25.0 | missing | missing | missing | |
| 1262 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_003155__191 | 0 | 0.0 | 10.2821 | 0 | [106, 164] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_003155__191.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1263 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_003209__809 | 0 | 0.0 | 13.4768 | 0 | [106, 220] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_003209__809.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1264 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_221346__670 | 0 | 0.0 | 17.3357 | 0 | [106, 287] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_221346__670.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1265 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_215149__670 | 0 | 0.0 | 17.2388 | 0 | [1, 502] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_215149__670.json | 25.0 | missing | missing | missing | |
| 1266 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_215200__649 | 0 | 0.0 | 11.2459 | 0 | [1, 337] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_215200__649.json | 0.0 | missing | missing | missing | |
| 1267 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_003125__396 | 0 | 0.0 | 32.5478 | 0 | [217, 372] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_003125__396.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1268 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_003145__593 | 0 | 0.0 | 19.656 | 0 | [217, 310] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_003145__593.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1269 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_221328__229 | 0 | 0.0 | 26.4703 | 0 | [217, 280] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_221328__229.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1270 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_215510__192 | 0 | 0.0 | 17.1119 | 0 | [1, 476] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_215510__192.json | 25.0 | missing | missing | missing | |
| 1271 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_215531__423 | 0 | 0.0 | 21.0286 | 0 | [1, 576] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_215531__423.json | 25.0 | missing | missing | missing | |
| 1272 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_003410__816 | 0 | 0.0 | 22.0189 | 0 | [373, 324] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_003410__816.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1273 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_003426__309 | 0 | 0.0 | 15.8834 | 0 | [373, 221] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_003426__309.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1274 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_221435__790 | 0 | 0.0 | 15.9167 | 0 | [373, 222] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_221435__790.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1275 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_215415__768 | 0 | 0.0 | 14.9531 | 0 | [1, 420] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_215415__768.json | 25.0 | missing | missing | missing | |
| 1276 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_215433__631 | 0 | 0.0 | 17.5486 | 0 | [1, 488] | 0.5.0-DEV | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_215433__631.json | 25.0 | missing | missing | missing | |
| 1277 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_003331__890 | 0 | 0.0 | 18.0692 | 0 | [371, 258] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_003331__890.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1278 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_003348__523 | 0 | 0.0 | 17.0537 | 0 | [371, 241] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_003348__523.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1279 | Apple-MacBook-Pro-M1 | count_model_rows | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_221419__647 | 0 | 0.0 | 14.9166 | 0 | [371, 205] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_221419__647.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1280 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | InJulia | 1SHOT | true | false | 5 | 20231213_234706__781 | 0 | 0.0 | 17.9475 | 0 | [60, 535] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__InJulia__1SHOT__20231213_234706__781.json | 25.0 | missing | missing | missing | |
| 1281 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | InJulia | 1SHOT | true | false | 5 | 20231225_000727__542 | 0 | 0.0 | 5.68033 | 0 | [66, 323] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_000727__542.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1282 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | InJulia | 1SHOT | true | true | 5 | 20231225_000733__386 | 0 | 0.0 | 6.23973 | 0 | [66, 355] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_000733__386.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1283 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | InJulia | 1SHOT | true | true | 5 | 20231226_220239__331 | 0 | 0.0 | 4.96201 | 0 | [66, 283] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__InJulia__1SHOT__20231226_220239__331.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1284 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_234648__510 | 0 | 0.0 | 6.58183 | 1 | [90, 190] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231213_234648__510.json | 62.5 | missing | missing | missing | |
| 1285 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_000713__171 | 0 | 0.0 | 3.22453 | 0 | [103, 176] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_000713__171.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1286 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_000721__931 | 0 | 0.0 | 8.22005 | 0 | [103, 454] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_000721__931.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1287 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231226_220234__317 | 0 | 0.0 | 2.72004 | 0 | [103, 146] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231226_220234__317.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1288 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_234642__419 | 0 | 0.0 | 14.3645 | 0 | [201, 380] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231213_234642__419.json | 50.0 | missing | missing | missing | |
| 1289 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_000701__904 | 0 | 0.0 | 12.6651 | 0 | [208, 509] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_000701__904.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1290 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_000710__877 | 0 | 0.0 | 8.11393 | 0 | [208, 421] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_000710__877.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1291 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_220232__564 | 0 | 0.0 | 8.61171 | 0 | [208, 308] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231226_220232__564.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1292 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231213_234754__880 | 0 | 0.0 | 17.5133 | 0 | [11, 481] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231213_234754__880.json | 25.0 | missing | missing | missing | |
| 1293 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_000807__901 | 0 | 0.0 | 13.6964 | 0 | [353, 651] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_000807__901.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1294 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_000818__196 | 0 | 0.0 | 11.1766 | 0 | [353, 530] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_000818__196.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1295 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_220256__488 | 0 | 0.0 | 7.17056 | 0 | [353, 330] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231226_220256__488.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1296 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_234736__919 | 0 | 0.0 | 19.0238 | 0 | [361, 441] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231213_234736__919.json | 50.0 | missing | missing | missing | |
| 1297 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_000745__524 | 0 | 0.0 | 6.11028 | 0 | [351, 279] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_000745__524.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1298 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_000753__143 | 0 | 0.0 | 7.929 | 0 | [351, 373] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_000753__143.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1299 | Apple-MacBook-Pro-M1 | count_model_rows | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_220249__455 | 0 | 0.0 | 9.12219 | 0 | [351, 432] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231226_220249__455.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1300 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | InJulia | 1SHOT | false | false | 5 | 20231213_233916__219 | 0 | 0.0 | 9.82109 | 0 | [60, 297] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__InJulia__1SHOT__20231213_233916__219.json | 0.0 | missing | missing | missing | |
| 1301 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231224_234934__594 | 1 | 0.0 | 7.26865 | 1 | [64, 234] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__InJulia__1SHOT__20231224_234934__594.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1302 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231224_234943__161 | 1 | 0.0 | 8.84993 | 1 | [64, 287] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__InJulia__1SHOT__20231224_234943__161.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1303 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231226_215713__312 | 1 | 0.0 | 14.6576 | 1 | [64, 476] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__InJulia__1SHOT__20231226_215713__312.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1304 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_233907__141 | 0 | 0.0 | 8.93196 | 0 | [90, 261] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231213_233907__141.json | 50.0 | missing | missing | missing | |
| 1305 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_234923__562 | 0 | 0.0 | 5.87122 | 0 | [106, 178] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231224_234923__562.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1306 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_234927__174 | 0 | 0.0 | 4.15057 | 0 | [106, 120] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231224_234927__174.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1307 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_215658__746 | 1 | 0.0 | 12.1175 | 2 | [106, 383] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231226_215658__746.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1308 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231213_233858__375 | 0 | 0.0 | 19.672 | 0 | [201, 523] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231213_233858__375.json | 25.0 | missing | missing | missing | |
| 1309 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231224_234901__192 | 1 | 0.0 | 20.4737 | 1 | [217, 450] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231224_234901__192.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1310 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231224_234917__614 | 0 | 0.0 | 15.2863 | 0 | [217, 463] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231224_234917__614.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1311 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_215646__175 | 0 | 0.0 | 19.577 | 0 | [217, 438] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231226_215646__175.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1312 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_234001__338 | 1 | 0.0 | 12.0508 | 1 | [11, 316] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231213_234001__338.json | 67.5 | missing | missing | missing | |
| 1313 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231224_235037__550 | 0 | 0.0 | 7.94165 | 0 | [373, 201] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231224_235037__550.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1314 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231224_235105__166 | 0 | 0.0 | 27.8619 | 0 | [373, 806] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231224_235105__166.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1315 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_215729__677 | 0 | 0.0 | 8.53079 | 0 | [373, 221] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231226_215729__677.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1316 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_233948__180 | 0 | 0.0 | 21.1216 | 0 | [361, 491] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231213_233948__180.json | 50.0 | missing | missing | missing | |
| 1317 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_235015__430 | 0 | 0.0 | 20.9847 | 0 | [371, 604] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231224_235015__430.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1318 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231224_235029__842 | 0 | 0.0 | 13.3196 | 0 | [371, 371] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231224_235029__842.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1319 | Apple-MacBook-Pro-M1 | count_model_rows | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_215721__942 | 1 | 0.0 | 7.73376 | 1 | [371, 196] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231226_215721__942.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1320 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231213_234035__395 | 0 | 0.0 | 15.5552 | 0 | [60, 467] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__InJulia__1SHOT__20231213_234035__395.json | 25.0 | missing | missing | missing | |
| 1321 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231224_235418__521 | 0 | 0.0 | 45.7727 | 0 | [61, 348] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__InJulia__1SHOT__20231224_235418__521.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1322 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231224_235533__135 | 1 | 0.0 | 75.1738 | 2 | [61, 570] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__InJulia__1SHOT__20231224_235533__135.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1323 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231226_215927__917 | 0 | 0.0 | 44.8418 | 0 | [61, 342] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__InJulia__1SHOT__20231226_215927__917.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1324 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231213_234019__569 | 0 | 0.0 | 2.3851 | 0 | [90, 57] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231213_234019__569.json | 0.0 | missing | missing | missing | |
| 1325 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_235303__947 | 0 | 0.0 | 20.1016 | 0 | [100, 138] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231224_235303__947.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1326 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231224_235332__979 | 0 | 0.0 | 28.9181 | 0 | [100, 207] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231224_235332__979.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1327 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_215842__317 | 0 | 0.0 | 31.3383 | 0 | [100, 226] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231226_215842__317.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1328 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_234017__224 | 0 | 0.0 | 16.0019 | 0 | [201, 424] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231213_234017__224.json | 50.0 | missing | missing | missing | |
| 1329 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231224_235155__921 | 0 | 0.0 | 50.6081 | 0 | [211, 167] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231224_235155__921.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1330 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231224_235243__249 | 1 | 0.0 | 47.1463 | 2 | [211, 328] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231224_235243__249.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1331 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_215811__590 | 0 | 0.0 | 41.2627 | 0 | [211, 109] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231226_215811__590.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1332 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_234126__382 | 0 | 0.0 | 21.2596 | 0 | [11, 574] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231213_234126__382.json | 50.0 | missing | missing | missing | |
| 1333 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_000004__412 | 0 | 0.0 | 73.3634 | 0 | [374, 488] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_000004__412.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1334 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_000043__370 | 0 | 0.0 | 39.1117 | 0 | [374, 237] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_000043__370.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1335 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_220035__337 | 0 | 0.0 | 14.4288 | 0 | [374, 51] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231226_220035__337.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1336 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_234105__321 | 0 | 0.0 | 20.1889 | 0 | [361, 470] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231213_234105__321.json | 50.0 | missing | missing | missing | |
| 1337 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231224_235741__837 | 0 | 0.0 | 36.5412 | 0 | [372, 218] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231224_235741__837.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1338 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231224_235850__987 | 0 | 0.0 | 69.642 | 0 | [372, 462] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231224_235850__987.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1339 | Apple-MacBook-Pro-M1 | count_model_rows | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_220021__784 | 1 | 0.0 | 53.8136 | 1 | [372, 348] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/count_model_rows/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231226_220021__784.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 1340 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | InJulia | 1SHOT | true | false | 5 | 20231214_000034__401 | 0 | 0.0 | 24.5438 | 0 | [155, 675] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231214_000034__401.json | 25.0 | missing | missing | missing | |
| 1341 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_010714__313 | 0 | 0.0 | 23.4561 | 0 | [163, 408] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_010714__313.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1342 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_010733__632 | 0 | 0.0 | 19.4932 | 0 | [163, 336] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_010733__632.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1343 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231226_223025__594 | 4 | 0.0 | 27.1754 | 4 | [163, 477] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231226_223025__594.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1344 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_000009__479 | 0 | 0.0 | 15.5025 | 0 | [184, 420] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231214_000009__479.json | 0.0 | missing | missing | missing | |
| 1345 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_010632__632 | 2 | 0.0 | 19.0882 | 3 | [201, 323] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_010632__632.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1346 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_010650__127 | 2 | 0.0 | 18.1504 | 3 | [201, 306] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_010650__127.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1347 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_222958__800 | 2 | 0.0 | 19.7778 | 4 | [201, 337] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231226_222958__800.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1348 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231213_235953__951 | 0 | 0.0 | 13.0769 | 0 | [280, 315] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231213_235953__951.json | 25.0 | missing | missing | missing | |
| 1349 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_010554__101 | 2 | 0.0 | 37.7664 | 3 | [298, 458] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_010554__101.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1350 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_010613__606 | 2 | 0.0 | 18.4311 | 3 | [298, 292] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_010613__606.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1351 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_222938__203 | 0 | 0.0 | 24.8798 | 0 | [298, 230] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231226_222938__203.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1352 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_000141__760 | 0 | 0.0 | 25.9064 | 0 | [11, 673] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231214_000141__760.json | 0.0 | missing | missing | missing | |
| 1353 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_010926__885 | 1 | 0.0 | 25.6768 | 4 | [466, 388] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_010926__885.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1354 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_010955__455 | 0 | 0.0 | 28.7018 | 0 | [466, 440] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_010955__455.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1355 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_223123__495 | 0 | 0.0 | 31.0197 | 0 | [466, 481] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231226_223123__495.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1356 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_000115__902 | 0 | 0.0 | 14.9184 | 0 | [455, 288] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231214_000115__902.json | 0.0 | missing | missing | missing | |
| 1357 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_010836__796 | 0 | 0.0 | 15.2939 | 0 | [463, 207] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_010836__796.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1358 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_010901__177 | 2 | 0.0 | 24.6206 | 5 | [463, 370] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_010901__177.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1359 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_223052__451 | 2 | 0.0 | 26.4461 | 4 | [463, 403] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231226_223052__451.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1360 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | InJulia | 1SHOT | true | false | 5 | 20231214_000248__517 | 0 | 0.0 | 25.2563 | 0 | [155, 693] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__InJulia__1SHOT__20231214_000248__517.json | 25.0 | missing | missing | missing | |
| 1361 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_011051__687 | 0 | 0.0 | 8.50558 | 0 | [137, 138] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_011051__687.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1362 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_011119__710 | 0 | 0.0 | 28.5189 | 0 | [137, 506] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_011119__710.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1363 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_000223__421 | 0 | 0.0 | 19.2196 | 0 | [184, 521] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231214_000223__421.json | 0.0 | missing | missing | missing | |
| 1364 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_011028__911 | 0 | 0.0 | 9.83538 | 0 | [138, 163] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_011028__911.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1365 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_011042__308 | 0 | 0.0 | 14.3757 | 0 | [138, 248] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_011042__308.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1366 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_000204__552 | 0 | 0.0 | 23.0983 | 0 | [280, 582] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231214_000204__552.json | 50.0 | missing | missing | missing | |
| 1367 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_011012__587 | 0 | 0.0 | 16.8307 | 0 | [173, 97] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_011012__587.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1368 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_011018__225 | 0 | 0.0 | 5.9643 | 0 | [173, 84] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_011018__225.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1369 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_000403__460 | 0 | 0.0 | 27.5707 | 5 | [11, 712] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231214_000403__460.json | 75.0 | missing | missing | missing | |
| 1370 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_011222__614 | 0 | 0.0 | 13.7206 | 0 | [155, 235] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_011222__614.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1371 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_011224__150 | 0 | 0.0 | 1.83038 | 0 | [155, 10] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_011224__150.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1372 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_000336__922 | 0 | 0.0 | 29.6447 | 5 | [455, 660] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231214_000336__922.json | 75.0 | missing | missing | missing | |
| 1373 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_011157__310 | 0 | 0.0 | 6.32042 | 0 | [152, 96] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_011157__310.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1374 | Apple-MacBook-Pro-M1 | weather_data_analyzer | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_011209__851 | 0 | 0.0 | 11.995 | 0 | [152, 203] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_011209__851.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1375 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_221134__941 | 0 | 0.0 | 28.3444 | 0 | [1, 801] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_221134__941.json | 25.0 | missing | missing | missing | |
| 1376 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_221156__764 | 0 | 0.0 | 22.1811 | 0 | [1, 643] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_221156__764.json | 25.0 | missing | missing | missing | |
| 1377 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_013715__311 | 2 | 0.0 | 76.708 | 3 | [163, 446] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_013715__311.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1378 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_013820__735 | 2 | 0.0 | 64.5219 | 3 | [163, 373] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_013820__735.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1379 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_224506__860 | 4 | 0.0 | 61.8166 | 5 | [163, 358] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231226_224506__860.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1380 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_221035__970 | 0 | 0.0 | 19.8304 | 0 | [1, 574] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_221035__970.json | 25.0 | missing | missing | missing | |
| 1381 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_221047__530 | 0 | 0.0 | 12.7103 | 0 | [1, 380] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_221047__530.json | 25.0 | missing | missing | missing | |
| 1382 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_013454__633 | 2 | 0.0 | 48.0312 | 5 | [204, 266] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_013454__633.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1383 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_013558__938 | 4 | 0.0 | 63.4225 | 5 | [204, 360] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_013558__938.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1384 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_224403__937 | 0 | 0.0 | 70.0414 | 3 | [204, 401] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_224403__937.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1385 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_220931__302 | 0 | 0.0 | 32.1639 | 0 | [1, 862] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_220931__302.json | 25.0 | missing | missing | missing | |
| 1386 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_220959__913 | 0 | 0.0 | 27.8832 | 0 | [1, 760] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_220959__913.json | 25.0 | missing | missing | missing | |
| 1387 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_013307__768 | 0 | 0.0 | 99.7698 | 5 | [299, 391] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_013307__768.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1388 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_013406__964 | 1 | 0.0 | 58.5001 | 1 | [299, 313] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_013406__964.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1389 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_224253__454 | 0 | 0.0 | 101.699 | 5 | [299, 434] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_224253__454.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1390 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_221525__241 | 0 | 0.0 | 24.0577 | 0 | [1, 635] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_221525__241.json | 0.0 | missing | missing | missing | |
| 1391 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_221601__257 | 0 | 0.0 | 36.3165 | 0 | [1, 919] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_221601__257.json | 25.0 | missing | missing | missing | |
| 1392 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_014431__546 | 0 | 0.0 | 64.5248 | 0 | [492, 314] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_014431__546.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1393 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_014547__605 | 0 | 0.0 | 75.0892 | 0 | [492, 376] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_014547__605.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1394 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_224808__426 | 4 | 0.0 | 79.4219 | 5 | [492, 403] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_224808__426.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1395 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_221412__643 | 0 | 0.0 | 11.2138 | 0 | [1, 311] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_221412__643.json | 0.0 | missing | missing | missing | |
| 1396 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_221436__323 | 0 | 0.0 | 24.8654 | 0 | [1, 655] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_221436__323.json | 0.0 | missing | missing | missing | |
| 1397 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_014210__198 | 0 | 0.0 | 78.9252 | 0 | [490, 394] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_014210__198.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1398 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_014326__611 | 3 | 0.0 | 75.4009 | 4 | [490, 378] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_014326__611.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1399 | Apple-MacBook-Pro-M1 | weather_data_analyzer | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_224648__510 | 0 | 0.0 | 101.847 | 4 | [490, 534] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_224648__510.json | 70.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1400 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231226_225705__399 | 0 | 0.0 | 10.7213 | 0 | [156, 398] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231226_225705__399.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1401 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_104415__884 | 0 | 0.0 | 14.1265 | 0 | [156, 522] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_104415__884.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1402 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_104431__810 | 0 | 0.0 | 15.9238 | 0 | [156, 587] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_104431__810.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1403 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_104529__735 | 0 | 0.0 | 58.0708 | 0 | [156, 1904] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_104529__735.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1404 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_225654__505 | 0 | 0.0 | 10.0286 | 0 | [193, 363] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_225654__505.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1405 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_104348__706 | 0 | 0.0 | 8.95364 | 0 | [193, 323] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_104348__706.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1406 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_104356__348 | 0 | 0.0 | 7.53589 | 0 | [193, 269] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_104356__348.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1407 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_104401__485 | 0 | 0.0 | 5.2168 | 0 | [193, 181] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_104401__485.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1408 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_225644__403 | 0 | 0.0 | 10.2809 | 0 | [277, 230] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_225644__403.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1409 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_104325__361 | 0 | 0.0 | 15.158 | 0 | [277, 406] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_104325__361.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1410 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_104330__502 | 0 | 0.0 | 4.85695 | 0 | [277, 155] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_104330__502.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1411 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_104339__870 | 0 | 0.0 | 9.72986 | 0 | [277, 337] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_104339__870.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1412 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_225723__349 | 0 | 0.0 | 7.11506 | 0 | [445, 209] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_225723__349.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1413 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_104605__761 | 0 | 0.0 | 5.71987 | 0 | [445, 158] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_104605__761.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1414 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_104616__179 | 0 | 0.0 | 10.7518 | 0 | [445, 339] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_104616__179.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1415 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_104625__908 | 0 | 0.0 | 8.86537 | 0 | [445, 272] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_104625__908.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1416 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_225715__373 | 0 | 0.0 | 10.4532 | 0 | [442, 329] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_225715__373.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1417 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_104539__737 | 0 | 0.0 | 9.62227 | 0 | [442, 299] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_104539__737.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1418 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_104549__779 | 0 | 0.0 | 10.1183 | 0 | [442, 317] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_104549__779.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1419 | Apple-MacBook-Pro-M1 | weather_data_analyzer | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_104559__637 | 0 | 0.0 | 10.4792 | 0 | [442, 330] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_104559__637.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1420 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | InJulia | 1SHOT | true | true | 5 | 20231213_235150__112 | 0 | 0.0 | 19.786 | 0 | [155, 549] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__InJulia__1SHOT__20231213_235150__112.json | 50.0 | missing | missing | missing | |
| 1421 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | InJulia | 1SHOT | true | true | 5 | 20231225_004306__666 | 0 | 0.0 | 21.8207 | 0 | [155, 604] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__InJulia__1SHOT__20231225_004306__666.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1422 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | InJulia | 1SHOT | true | false | 5 | 20231225_004338__466 | 0 | 0.0 | 31.2168 | 0 | [1, 870] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__InJulia__1SHOT__20231225_004338__466.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1423 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | InJulia | 1SHOT | true | false | 5 | 20231226_222057__326 | 0 | 0.0 | 30.0519 | 0 | [155, 825] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__InJulia__1SHOT__20231226_222057__326.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1424 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231213_235131__651 | 0 | 0.0 | 16.2652 | 0 | [184, 442] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaExpertAsk__1SHOT__20231213_235131__651.json | 0.0 | missing | missing | missing | |
| 1425 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_004227__396 | 1 | 0.0 | 21.3075 | 1 | [184, 576] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_004227__396.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1426 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_004245__378 | 0 | 0.0 | 17.4589 | 0 | [1, 509] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_004245__378.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1427 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_222027__999 | 1 | 0.0 | 20.0148 | 5 | [184, 550] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaExpertAsk__1SHOT__20231226_222027__999.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1428 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231213_235114__377 | 0 | 0.0 | 30.6795 | 0 | [280, 772] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231213_235114__377.json | 50.0 | missing | missing | missing | |
| 1429 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_004134__852 | 0 | 0.0 | 43.374 | 0 | [298, 945] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_004134__852.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1430 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_004205__409 | 0 | 0.0 | 31.0204 | 5 | [1, 834] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_004205__409.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1431 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_222006__796 | 0 | 0.0 | 31.5484 | 0 | [298, 679] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_222006__796.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1432 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_235252__665 | 0 | 0.0 | 14.2854 | 0 | [11, 384] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231213_235252__665.json | 0.0 | missing | missing | missing | |
| 1433 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_004549__744 | 0 | 0.0 | 22.7765 | 0 | [11, 597] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_004549__744.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1434 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_004600__952 | 0 | 0.0 | 10.5604 | 0 | [1, 293] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_004600__952.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1435 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_222147__482 | 0 | 0.0 | 19.1878 | 4 | [11, 515] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_222147__482.json | 70.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1436 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231213_235238__344 | 0 | 0.0 | 27.6486 | 0 | [455, 612] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaRecapTask__1SHOT__20231213_235238__344.json | 50.0 | missing | missing | missing | |
| 1437 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_004453__964 | 0 | 0.0 | 31.7742 | 0 | [455, 710] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_004453__964.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1438 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_004526__961 | 0 | 0.0 | 32.5024 | 5 | [1, 832] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_004526__961.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1439 | Apple-MacBook-Pro-M1 | weather_data_analyzer | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_222128__364 | 0 | 0.0 | 30.859 | 0 | [455, 697] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/llama2/evaluation__JuliaRecapTask__1SHOT__20231226_222128__364.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1440 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | InJulia | 1SHOT | true | false | 5 | 20231214_000501__874 | 0 | 0.0 | 22.0551 | 0 | [155, 610] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__InJulia__1SHOT__20231214_000501__874.json | 25.0 | missing | missing | missing | |
| 1441 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_011327__487 | 1 | 0.0 | 10.2432 | 5 | [155, 322] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__InJulia__1SHOT__20231225_011327__487.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1442 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_011338__600 | 3 | 0.0 | 11.1629 | 5 | [155, 352] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__InJulia__1SHOT__20231225_011338__600.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1443 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | InJulia | 1SHOT | true | true | 5 | 20231226_223209__713 | 1 | 0.0 | 13.97 | 5 | [155, 443] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__InJulia__1SHOT__20231226_223209__713.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1444 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_000439__595 | 0 | 0.0 | 15.7073 | 0 | [184, 426] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231214_000439__595.json | 50.0 | missing | missing | missing | |
| 1445 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_011308__125 | 0 | 0.0 | 11.6662 | 0 | [194, 358] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_011308__125.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1446 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_011317__260 | 0 | 0.0 | 8.34874 | 0 | [194, 249] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_011317__260.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1447 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_223155__713 | 3 | 0.0 | 12.9277 | 5 | [194, 399] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231226_223155__713.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1448 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_000423__706 | 0 | 0.0 | 19.893 | 0 | [280, 499] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231214_000423__706.json | 0.0 | missing | missing | missing | |
| 1449 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_011243__119 | 0 | 0.0 | 18.5083 | 5 | [290, 366] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_011243__119.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1450 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_011256__748 | 3 | 0.0 | 13.202 | 5 | [290, 387] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_011256__748.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1451 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_223142__499 | 0 | 0.0 | 18.6142 | 0 | [290, 377] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231226_223142__499.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1452 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_000617__576 | 0 | 0.0 | 16.8756 | 0 | [11, 450] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231214_000617__576.json | 25.0 | missing | missing | missing | |
| 1453 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_011437__581 | 3 | 0.0 | 13.553 | 5 | [458, 366] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_011437__581.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1454 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_011450__105 | 3 | 0.0 | 12.6229 | 5 | [458, 338] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_011450__105.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1455 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_223237__834 | 1 | 0.0 | 16.3832 | 5 | [458, 453] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231226_223237__834.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1456 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_000600__234 | 0 | 0.0 | 38.225 | 0 | [455, 859] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaRecapTask__1SHOT__20231214_000600__234.json | 25.0 | missing | missing | missing | |
| 1457 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_011411__736 | 3 | 0.0 | 12.3947 | 5 | [455, 331] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_011411__736.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1458 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_011424__237 | 0 | 0.0 | 12.8764 | 0 | [455, 345] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_011424__237.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1459 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_223220__349 | 3 | 0.0 | 10.8898 | 5 | [455, 284] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder/evaluation__JuliaRecapTask__1SHOT__20231226_223220__349.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1460 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_175508__517 | 4 | 0.0 | 17.7621 | 5 | [155, 336] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_175508__517.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1461 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_175526__836 | 3 | 0.0 | 17.3546 | 5 | [155, 328] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_175526__836.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1462 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_175545__220 | 4 | 0.0 | 19.0476 | 5 | [155, 361] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_175545__220.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1463 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_175409__234 | 3 | 0.0 | 14.9109 | 5 | [194, 273] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_175409__234.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1464 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_175431__218 | 0 | 0.0 | 21.5599 | 0 | [194, 402] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_175431__218.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1465 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_175450__899 | 3 | 0.0 | 19.521 | 5 | [194, 363] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_175450__899.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1466 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_175308__703 | 0 | 0.0 | 23.2991 | 5 | [290, 422] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_175308__703.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1467 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_175333__489 | 0 | 0.0 | 24.9473 | 5 | [290, 453] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_175333__489.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1468 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_175354__767 | 0 | 0.0 | 20.9069 | 5 | [290, 376] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_175354__767.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1469 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_175713__593 | 3 | 0.0 | 24.2353 | 5 | [458, 416] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_175713__593.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1470 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_175726__496 | 3 | 0.0 | 12.3569 | 5 | [458, 191] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_175726__496.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1471 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_175737__918 | 3 | 0.0 | 11.1968 | 5 | [458, 169] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_175737__918.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1472 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_175608__761 | 3 | 0.0 | 22.6319 | 5 | [455, 386] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_175608__761.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1473 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_175628__751 | 3 | 0.0 | 19.9783 | 5 | [455, 336] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_175628__751.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1474 | Apple-MacBook-Pro-M1 | weather_data_analyzer | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_175649__155 | 4 | 0.0 | 21.2001 | 5 | [455, 359] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_175649__155.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1475 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_222501__871 | 0 | 0.0 | 26.7759 | 0 | [1, 762] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_222501__871.json | 25.0 | missing | missing | missing | |
| 1476 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_222521__419 | 0 | 0.0 | 20.1413 | 0 | [1, 589] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_222521__419.json | 25.0 | missing | missing | missing | |
| 1477 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231225_015119__356 | 0 | 0.0 | 14.397 | 0 | [157, 350] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_015119__356.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1478 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_015136__517 | 0 | 0.0 | 17.3861 | 0 | [157, 425] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_015136__517.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1479 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_225055__303 | 3 | 0.0 | 16.6469 | 5 | [157, 406] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231226_225055__303.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1480 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_222400__187 | 0 | 0.0 | 12.9663 | 0 | [1, 387] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_222400__187.json | 25.0 | missing | missing | missing | |
| 1481 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_222416__116 | 0 | 0.0 | 15.4509 | 0 | [1, 456] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_222416__116.json | 25.0 | missing | missing | missing | |
| 1482 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_015056__685 | 1 | 0.0 | 16.2827 | 1 | [198, 388] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_015056__685.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1483 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_015104__152 | 0 | 0.0 | 7.98344 | 1 | [198, 177] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_015104__152.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1484 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_225039__625 | 0 | 0.0 | 7.07708 | 0 | [198, 154] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_225039__625.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1485 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_222315__711 | 0 | 0.0 | 23.9404 | 0 | [1, 662] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_222315__711.json | 25.0 | missing | missing | missing | |
| 1486 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_222334__354 | 0 | 0.0 | 18.5604 | 0 | [1, 525] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_222334__354.json | 25.0 | missing | missing | missing | |
| 1487 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_015037__424 | 0 | 0.0 | 22.4648 | 0 | [294, 381] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_015037__424.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1488 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_015040__838 | 0 | 0.0 | 2.92654 | 0 | [294, 33] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_015040__838.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1489 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_225031__143 | 2 | 0.0 | 28.3601 | 5 | [294, 536] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_225031__143.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1490 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_222740__427 | 0 | 0.0 | 21.8417 | 0 | [1, 581] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_222740__427.json | 0.0 | missing | missing | missing | |
| 1491 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_222801__837 | 0 | 0.0 | 20.3479 | 0 | [1, 544] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_222801__837.json | 0.0 | missing | missing | missing | |
| 1492 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_015307__199 | 0 | 0.0 | 21.105 | 0 | [465, 460] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_015307__199.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1493 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_015329__392 | 0 | 0.0 | 21.8849 | 0 | [465, 479] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_015329__392.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1494 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_225136__679 | 0 | 0.0 | 18.8599 | 0 | [465, 406] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_225136__679.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1495 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_222712__892 | 0 | 0.0 | 10.7321 | 0 | [1, 298] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_222712__892.json | 0.0 | missing | missing | missing | |
| 1496 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_222714__343 | 0 | 0.0 | 2.15599 | 0 | [1, 62] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_222714__343.json | 0.0 | missing | missing | missing | |
| 1497 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_015224__953 | 0 | 0.0 | 17.205 | 0 | [463, 366] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_015224__953.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1498 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_015246__442 | 0 | 0.0 | 21.7891 | 5 | [463, 477] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_015246__442.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1499 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231226_225117__780 | 0 | 0.0 | 22.1063 | 0 | [463, 484] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_225117__780.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1500 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_225911__715 | 1 | 0.0 | 15.9333 | 1 | [156, 490] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_225911__715.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1501 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_225922__293 | 0 | 0.0 | 10.3067 | 0 | [156, 312] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_225922__293.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1502 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_225933__478 | 4 | 0.0 | 11.7945 | 4 | [156, 360] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_225933__478.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1503 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_225945__894 | 1 | 0.0 | 11.1702 | 1 | [156, 340] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_225945__894.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1504 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_225958__799 | 1 | 0.0 | 12.4873 | 1 | [156, 382] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_225958__799.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1505 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_225830__509 | 0 | 0.0 | 5.70508 | 0 | [197, 150] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_225830__509.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1506 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_225837__540 | 1 | 0.0 | 6.69868 | 1 | [197, 186] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_225837__540.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1507 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_225842__232 | 1 | 0.0 | 5.73666 | 5 | [197, 155] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_225842__232.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1508 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_225849__573 | 0 | 0.0 | 6.53278 | 0 | [197, 181] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_225849__573.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1509 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_225855__594 | 1 | 0.0 | 6.15435 | 1 | [197, 169] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_225855__594.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1510 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_225714__831 | 1 | 0.0 | 16.0615 | 1 | [293, 439] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_225714__831.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1511 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_225732__527 | 1 | 0.0 | 18.3042 | 1 | [293, 531] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_225732__527.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1512 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_225748__666 | 0 | 0.0 | 15.7005 | 0 | [293, 452] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_225748__666.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1513 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_225807__160 | 2 | 0.0 | 19.1978 | 4 | [293, 559] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_225807__160.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1514 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_225823__367 | 2 | 0.0 | 14.8073 | 5 | [293, 425] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_225823__367.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1515 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_230132__800 | 3 | 0.0 | 13.1829 | 4 | [464, 344] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_230132__800.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1516 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_230148__510 | 1 | 0.0 | 15.2176 | 1 | [464, 405] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_230148__510.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1517 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_230159__638 | 1 | 0.0 | 11.3118 | 1 | [464, 286] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_230159__638.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1518 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_230216__981 | 0 | 0.0 | 16.3178 | 0 | [464, 438] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_230216__981.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1519 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_230231__704 | 1 | 0.0 | 15.0326 | 1 | [464, 399] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_230231__704.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1520 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_230020__408 | 1 | 0.0 | 22.1162 | 1 | [462, 608] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_230020__408.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1521 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_230033__631 | 1 | 0.0 | 13.1995 | 1 | [462, 344] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_230033__631.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1522 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_230046__365 | 1 | 0.0 | 12.3823 | 1 | [462, 319] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_230046__365.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1523 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_230102__391 | 0 | 0.0 | 15.8378 | 0 | [462, 423] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_230102__391.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1524 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_230119__920 | 0 | 0.0 | 17.2961 | 0 | [462, 467] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_230119__920.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1525 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_230518__139 | 0 | 0.0 | 21.1633 | 0 | [156, 517] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_230518__139.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1526 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_230537__924 | 0 | 0.0 | 18.8878 | 0 | [156, 461] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_230537__924.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1527 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_230551__192 | 0 | 0.0 | 13.7787 | 0 | [156, 333] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_230551__192.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1528 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_230606__197 | 0 | 0.0 | 15.2959 | 0 | [156, 371] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_230606__197.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1529 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_230621__315 | 0 | 0.0 | 14.5072 | 0 | [156, 351] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_230621__315.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1530 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_230421__676 | 2 | 0.0 | 8.71803 | 4 | [197, 195] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_230421__676.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1531 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_230430__104 | 1 | 0.0 | 8.81139 | 1 | [197, 197] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_230430__104.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1532 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_230437__216 | 4 | 0.0 | 7.28573 | 5 | [197, 158] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_230437__216.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1533 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_230446__228 | 0 | 0.0 | 8.34264 | 0 | [197, 185] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_230446__228.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1534 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_230457__688 | 1 | 0.0 | 10.7599 | 1 | [197, 247] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_230457__688.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1535 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_230258__150 | 0 | 0.0 | 26.8734 | 5 | [293, 608] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_230258__150.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1536 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_230311__192 | 2 | 0.0 | 13.5903 | 5 | [293, 303] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_230311__192.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1537 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_230325__758 | 0 | 0.0 | 13.4033 | 0 | [293, 298] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_230325__758.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1538 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_230345__699 | 0 | 0.0 | 20.2852 | 0 | [293, 468] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_230345__699.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1539 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_230412__400 | 0 | 0.0 | 26.8961 | 0 | [293, 628] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_230412__400.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1540 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_230835__730 | 0 | 0.0 | 22.9244 | 0 | [464, 501] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_230835__730.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1541 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_230856__890 | 3 | 0.0 | 19.9997 | 4 | [464, 432] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_230856__890.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1542 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_230914__401 | 2 | 0.0 | 18.1412 | 2 | [464, 387] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_230914__401.json | 70.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1543 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_230940__871 | 0 | 0.0 | 26.4502 | 0 | [464, 585] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_230940__871.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1544 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_231009__823 | 0 | 0.0 | 28.495 | 0 | [464, 633] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_231009__823.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1545 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_230638__871 | 1 | 0.0 | 17.1058 | 1 | [462, 362] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_230638__871.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1546 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_230655__127 | 1 | 0.0 | 16.9649 | 1 | [462, 358] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_230655__127.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1547 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_230724__740 | 5 | 0.0 | 28.5364 | 5 | [462, 634] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_230724__740.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1548 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_230747__174 | 5 | 0.0 | 22.4799 | 5 | [462, 491] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_230747__174.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1549 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_230812__219 | 4 | 0.0 | 24.7566 | 5 | [462, 545] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_230812__219.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1550 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_120640__917 | 1 | 0.0 | 18.2328 | 1 | [156, 322] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_120640__917.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1551 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_120701__309 | 0 | 0.0 | 20.5566 | 0 | [156, 366] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_120701__309.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1552 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_225530__424 | 1 | 0.0 | 19.8629 | 1 | [156, 356] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_225530__424.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1553 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_120612__402 | 0 | 0.0 | 8.81605 | 0 | [197, 142] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_120612__402.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1554 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_120622__860 | 0 | 0.0 | 9.82826 | 0 | [197, 159] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_120622__860.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1555 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_225509__756 | 1 | 0.0 | 11.4169 | 1 | [197, 192] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_225509__756.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1556 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_120535__349 | 0 | 0.0 | 32.0354 | 0 | [293, 553] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_120535__349.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1557 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_120603__955 | 0 | 0.0 | 27.7579 | 0 | [293, 476] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_120603__955.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1558 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_225458__605 | 0 | 0.0 | 34.0564 | 0 | [293, 436] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_225458__605.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1559 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_120903__197 | 2 | 0.0 | 21.4305 | 5 | [464, 338] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_120903__197.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1560 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_120931__938 | 1 | 0.0 | 28.3009 | 1 | [464, 455] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_120931__938.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1561 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_225633__644 | 1 | 0.0 | 23.733 | 1 | [464, 385] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_225633__644.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1562 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_120809__433 | 2 | 0.0 | 24.9299 | 4 | [462, 396] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_120809__433.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1563 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231226_120842__828 | 0 | 0.0 | 32.2973 | 0 | [462, 522] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_120842__828.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1564 | Apple-MacBook-Pro-M1 | weather_data_analyzer | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_225609__676 | 1 | 0.0 | 38.9794 | 1 | [462, 655] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_225609__676.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1565 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_105356__462 | 3 | 0.0 | 61.9122 | 5 | [159, 340] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_105356__462.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1566 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_105529__388 | 1 | 0.0 | 93.0929 | 1 | [159, 507] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_105529__388.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1567 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_105614__471 | 2 | 0.0 | 45.0402 | 5 | [159, 250] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_105614__471.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1568 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_143856__832 | 0 | 0.0 | 56.1772 | 0 | [159, 318] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_143856__832.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1569 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_144018__126 | 3 | 0.0 | 81.2281 | 4 | [159, 467] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_144018__126.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1570 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_105135__108 | 1 | 0.0 | 59.1567 | 2 | [198, 327] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_105135__108.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1571 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_105223__823 | 0 | 0.0 | 47.5428 | 0 | [198, 257] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_105223__823.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1572 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_105254__284 | 0 | 0.0 | 30.5051 | 0 | [198, 154] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_105254__284.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1573 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_143710__840 | 0 | 0.0 | 39.9977 | 0 | [198, 208] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_143710__840.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1574 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_143800__642 | 0 | 0.0 | 50.4785 | 0 | [198, 274] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_143800__642.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1575 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_104819__420 | 0 | 0.0 | 113.663 | 3 | [290, 598] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_104819__420.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1576 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_104945__837 | 1 | 0.0 | 86.1574 | 1 | [290, 469] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_104945__837.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1577 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_105036__260 | 0 | 0.0 | 50.6996 | 0 | [290, 260] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_105036__260.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1578 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_143459__836 | 1 | 0.0 | 73.7959 | 1 | [290, 395] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_143459__836.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1579 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_143628__841 | 2 | 0.0 | 87.8332 | 5 | [290, 477] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_143628__841.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1580 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_110048__247 | 0 | 0.0 | 12.6012 | 0 | [472, 4] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_110048__247.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1581 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_110146__675 | 0 | 0.0 | 58.8394 | 0 | [472, 278] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_110146__675.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1582 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_110159__431 | 0 | 0.0 | 12.7126 | 0 | [472, 5] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_110159__431.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1583 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_144503__859 | 0 | 0.0 | 122.708 | 0 | [472, 641] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_144503__859.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1584 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_144516__720 | 0 | 0.0 | 12.7586 | 0 | [472, 4] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_144516__720.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1585 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_105756__507 | 1 | 0.0 | 100.452 | 4 | [470, 516] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_105756__507.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1586 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_105946__369 | 4 | 0.0 | 110.003 | 5 | [470, 571] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_105946__369.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1587 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_110035__188 | 1 | 0.0 | 48.3991 | 5 | [470, 216] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_110035__188.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1588 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_144048__872 | 0 | 0.0 | 30.1751 | 0 | [470, 108] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_144048__872.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1589 | Apple-MacBook-Pro-M1 | weather_data_analyzer | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_144300__964 | 3 | 0.0 | 131.459 | 5 | [470, 690] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_144300__964.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1590 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_223056__316 | 0 | 0.0 | 19.3416 | 0 | [1, 568] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_223056__316.json | 0.0 | missing | missing | missing | |
| 1591 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_223115__857 | 0 | 0.0 | 18.6061 | 0 | [1, 548] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_223115__857.json | 25.0 | missing | missing | missing | |
| 1592 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_015450__735 | 0 | 0.0 | 12.9547 | 4 | [165, 309] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_015450__735.json | 70.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1593 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_015502__141 | 2 | 0.0 | 12.2239 | 3 | [165, 290] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_015502__141.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1594 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_225220__814 | 0 | 0.0 | 15.5551 | 1 | [165, 374] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231226_225220__814.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1595 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_223004__444 | 0 | 0.0 | 20.6822 | 0 | [1, 597] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_223004__444.json | 0.0 | missing | missing | missing | |
| 1596 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_223017__781 | 0 | 0.0 | 13.1026 | 0 | [1, 391] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_223017__781.json | 25.0 | missing | missing | missing | |
| 1597 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_015420__723 | 0 | 0.0 | 9.12353 | 4 | [206, 206] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_015420__723.json | 70.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1598 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_015437__182 | 0 | 0.0 | 16.203 | 0 | [206, 385] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_015437__182.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1599 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_225205__403 | 0 | 0.0 | 10.0376 | 4 | [206, 229] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_225205__403.json | 70.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1600 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_222906__163 | 0 | 0.0 | 24.1287 | 0 | [1, 667] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_222906__163.json | 0.0 | missing | missing | missing | |
| 1601 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_222928__143 | 0 | 0.0 | 22.021 | 0 | [1, 614] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_222928__143.json | 25.0 | missing | missing | missing | |
| 1602 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_015351__830 | 0 | 0.0 | 21.9191 | 1 | [302, 353] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_015351__830.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1603 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_015411__150 | 0 | 0.0 | 19.1149 | 1 | [302, 440] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_015411__150.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1604 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_225154__419 | 1 | 0.0 | 17.8548 | 1 | [302, 261] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_225154__419.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1605 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_223448__912 | 0 | 0.0 | 8.52838 | 0 | [1, 239] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_223448__912.json | 0.0 | missing | missing | missing | |
| 1606 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_223528__140 | 0 | 0.0 | 40.0495 | 0 | [1, 1002] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_223528__140.json | 25.0 | missing | missing | missing | |
| 1607 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_015618__110 | 0 | 0.0 | 11.5996 | 0 | [473, 228] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_015618__110.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1608 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_015639__114 | 4 | 0.0 | 21.0126 | 5 | [473, 457] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_015639__114.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1609 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_225301__735 | 0 | 0.0 | 23.8841 | 0 | [473, 525] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_225301__735.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1610 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_223336__637 | 0 | 0.0 | 22.2286 | 0 | [1, 591] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_223336__637.json | 0.0 | missing | missing | missing | |
| 1611 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_223437__875 | 0 | 0.0 | 60.9232 | 0 | [1, 1436] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_223437__875.json | 25.0 | missing | missing | missing | |
| 1612 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_015551__447 | 2 | 0.0 | 15.7896 | 5 | [471, 331] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_015551__447.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1613 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_015607__441 | 4 | 0.0 | 14.6284 | 5 | [471, 302] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_015607__441.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1614 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_225237__866 | 2 | 0.0 | 15.655 | 5 | [471, 327] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_225237__866.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1615 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 5 | 20231213_235345__285 | 0 | 0.0 | 16.754 | 0 | [155, 466] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231213_235345__285.json | 25.0 | missing | missing | missing | |
| 1616 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_004718__660 | 4 | 0.0 | 13.35 | 5 | [163, 411] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_004718__660.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1617 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_004728__700 | 1 | 0.0 | 9.08286 | 1 | [163, 269] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_004728__700.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1618 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231226_222238__996 | 5 | 0.0 | 13.9987 | 5 | [163, 433] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231226_222238__996.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1619 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_235328__807 | 0 | 0.0 | 16.798 | 3 | [184, 455] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231213_235328__807.json | 65.0 | missing | missing | missing | |
| 1620 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_004654__131 | 4 | 0.0 | 11.3195 | 5 | [204, 339] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_004654__131.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1621 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_004704__359 | 2 | 0.0 | 10.4713 | 5 | [204, 312] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_004704__359.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1622 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_222224__606 | 3 | 0.0 | 12.8794 | 5 | [204, 390] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231226_222224__606.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1623 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231213_235311__722 | 0 | 0.0 | 18.7638 | 0 | [280, 469] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231213_235311__722.json | 0.0 | missing | missing | missing | |
| 1624 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_004624__360 | 3 | 0.0 | 24.4981 | 5 | [300, 563] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_004624__360.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1625 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_004642__263 | 0 | 0.0 | 15.8772 | 4 | [300, 459] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_004642__263.json | 70.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1626 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_222210__357 | 4 | 0.0 | 22.5858 | 5 | [300, 518] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231226_222210__357.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1627 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231213_235451__682 | 0 | 0.0 | 17.9051 | 0 | [11, 476] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231213_235451__682.json | 50.0 | missing | missing | missing | |
| 1628 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_004854__827 | 1 | 0.0 | 18.6227 | 2 | [471, 513] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_004854__827.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1629 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_004907__397 | 0 | 0.0 | 13.1066 | 0 | [471, 344] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_004907__397.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1630 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_222311__482 | 3 | 0.0 | 18.4136 | 5 | [471, 508] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231226_222311__482.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1631 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | false | 5 | 20231213_235433__829 | 0 | 0.0 | 24.1298 | 0 | [455, 525] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231213_235433__829.json | 25.0 | missing | missing | missing | |
| 1632 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_004815__305 | 0 | 0.0 | 16.797 | 1 | [469, 458] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_004815__305.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1633 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_004834__688 | 3 | 0.0 | 19.6766 | 5 | [469, 545] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_004834__688.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1634 | Apple-MacBook-Pro-M1 | weather_data_analyzer | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_222252__805 | 4 | 0.0 | 12.9699 | 5 | [469, 339] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231226_222252__805.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1635 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231214_001026__197 | 0 | 0.0 | 22.0328 | 0 | [155, 609] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__InJulia__1SHOT__20231214_001026__197.json | 0.0 | missing | missing | missing | |
| 1636 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_011805__320 | 0 | 0.0 | 4.34159 | 0 | [158, 58] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__InJulia__1SHOT__20231225_011805__320.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1637 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_011808__694 | 0 | 0.0 | 3.93989 | 0 | [158, 50] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__InJulia__1SHOT__20231225_011808__694.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1638 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231226_223358__776 | 0 | 0.0 | 2.81436 | 0 | [158, 29] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__InJulia__1SHOT__20231226_223358__776.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1639 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_001004__106 | 0 | 0.0 | 19.445 | 0 | [184, 527] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231214_001004__106.json | 50.0 | missing | missing | missing | |
| 1640 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_011744__360 | 0 | 0.0 | 20.5312 | 0 | [197, 348] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_011744__360.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1641 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_011800__700 | 0 | 0.0 | 16.5377 | 0 | [197, 275] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_011800__700.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1642 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_223355__621 | 0 | 0.0 | 19.9785 | 0 | [197, 339] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231226_223355__621.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1643 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_000944__714 | 0 | 0.0 | 24.4597 | 0 | [280, 617] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231214_000944__714.json | 0.0 | missing | missing | missing | |
| 1644 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_011657__568 | 0 | 0.0 | 39.3985 | 0 | [293, 492] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_011657__568.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1645 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_011723__741 | 1 | 0.0 | 26.1396 | 1 | [293, 428] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_011723__741.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1646 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_223335__733 | 0 | 0.0 | 16.8404 | 0 | [293, 96] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231226_223335__733.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1647 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_001129__334 | 0 | 0.0 | 20.6436 | 1 | [11, 545] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231214_001129__334.json | 55.0 | missing | missing | missing | |
| 1648 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_012022__436 | 0 | 0.0 | 32.792 | 1 | [461, 507] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_012022__436.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1649 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_012111__658 | 1 | 0.0 | 48.8575 | 1 | [461, 771] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_012111__658.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1650 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_223527__444 | 0 | 0.0 | 62.3047 | 0 | [461, 988] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231226_223527__444.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1651 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_001108__866 | 1 | 0.0 | 20.5363 | 1 | [455, 435] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231214_001108__866.json | 60.0 | missing | missing | missing | |
| 1652 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_011907__697 | 0 | 0.0 | 51.067 | 0 | [458, 808] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_011907__697.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1653 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_011949__181 | 0 | 0.0 | 41.9574 | 0 | [458, 660] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_011949__181.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1654 | Apple-MacBook-Pro-M1 | weather_data_analyzer | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_223425__680 | 0 | 0.0 | 27.0679 | 0 | [458, 412] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231226_223425__680.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1655 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | true | false | 5 | 20231219_223809__387 | 0 | 0.0 | 22.5997 | 0 | [1, 654] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_223809__387.json | 25.0 | missing | missing | missing | |
| 1656 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | true | false | 5 | 20231219_223824__221 | 0 | 0.0 | 15.101 | 0 | [1, 452] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_223824__221.json | 25.0 | missing | missing | missing | |
| 1657 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_015713__952 | 0 | 0.0 | 10.9863 | 0 | [150, 410] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_015713__952.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1658 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_015735__753 | 0 | 0.0 | 21.713 | 0 | [150, 793] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_015735__753.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1659 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231226_225402__343 | 0 | 0.0 | 21.0231 | 0 | [150, 767] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231226_225402__343.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1660 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_223711__411 | 0 | 0.0 | 17.5883 | 0 | [1, 514] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_223711__411.json | 25.0 | missing | missing | missing | |
| 1661 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_223724__622 | 0 | 0.0 | 12.6206 | 0 | [1, 377] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_223724__622.json | 25.0 | missing | missing | missing | |
| 1662 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_015656__129 | 0 | 0.0 | 5.43524 | 0 | [187, 193] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_015656__129.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1663 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_015702__895 | 0 | 0.0 | 6.33638 | 0 | [187, 228] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_015702__895.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1664 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_225341__647 | 0 | 0.0 | 28.3147 | 0 | [187, 1001] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_225341__647.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1665 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_223616__499 | 0 | 0.0 | 23.0583 | 0 | [1, 640] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_223616__499.json | 25.0 | missing | missing | missing | |
| 1666 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_223638__405 | 0 | 0.0 | 21.2794 | 0 | [1, 595] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_223638__405.json | 25.0 | missing | missing | missing | |
| 1667 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_015644__467 | 0 | 0.0 | 4.59122 | 0 | [271, 1] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_015644__467.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1668 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_015650__598 | 0 | 0.0 | 6.32847 | 0 | [271, 211] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_015650__598.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1669 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_225312__592 | 0 | 0.0 | 11.8317 | 0 | [271, 287] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_225312__592.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1670 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_224130__138 | 0 | 0.0 | 21.2263 | 0 | [1, 566] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_224130__138.json | 25.0 | missing | missing | missing | |
| 1671 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_224150__477 | 0 | 0.0 | 19.8011 | 0 | [1, 531] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_224150__477.json | 25.0 | missing | missing | missing | |
| 1672 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_015846__818 | 0 | 0.0 | 1.50748 | 0 | [439, 1] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_015846__818.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1673 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_015852__132 | 0 | 0.0 | 6.08236 | 0 | [439, 172] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_015852__132.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1674 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_225424__224 | 0 | 0.0 | 11.8003 | 0 | [439, 377] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_225424__224.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1675 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_224028__828 | 0 | 0.0 | 30.0108 | 0 | [1, 776] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_224028__828.json | 25.0 | missing | missing | missing | |
| 1676 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_224051__875 | 0 | 0.0 | 22.793 | 0 | [1, 605] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_224051__875.json | 25.0 | missing | missing | missing | |
| 1677 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_015817__408 | 0 | 0.0 | 11.4897 | 0 | [436, 367] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_015817__408.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1678 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_015844__875 | 0 | 0.0 | 26.6979 | 0 | [436, 877] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_015844__875.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1679 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_225412__773 | 0 | 0.0 | 10.3172 | 0 | [436, 325] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_225412__773.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1680 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231214_001231__249 | 0 | 0.0 | 20.1494 | 0 | [155, 560] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231214_001231__249.json | 50.0 | missing | missing | missing | |
| 1681 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_012606__238 | 4 | 0.0 | 49.7197 | 5 | [166, 368] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_012606__238.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1682 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_012645__152 | 4 | 0.0 | 38.5566 | 5 | [166, 280] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_012645__152.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1683 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231226_223858__545 | 3 | 0.0 | 54.1394 | 4 | [166, 399] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231226_223858__545.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1684 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_001211__932 | 0 | 0.0 | 18.3204 | 0 | [184, 497] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231214_001211__932.json | 0.0 | missing | missing | missing | |
| 1685 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_012433__789 | 4 | 0.0 | 60.8031 | 5 | [205, 448] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_012433__789.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1686 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_012516__362 | 3 | 0.0 | 43.1207 | 4 | [205, 310] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_012516__362.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1687 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_223804__214 | 4 | 0.0 | 72.3195 | 5 | [205, 536] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231226_223804__214.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1688 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_001153__399 | 0 | 0.0 | 23.8805 | 0 | [280, 601] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_001153__399.json | 25.0 | missing | missing | missing | |
| 1689 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_012234__304 | 0 | 0.0 | 83.1678 | 0 | [301, 425] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_012234__304.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1690 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_012332__150 | 3 | 0.0 | 57.5829 | 4 | [301, 404] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_012332__150.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1691 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_223651__233 | 4 | 0.0 | 84.0566 | 5 | [301, 448] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_223651__233.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1692 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_001401__286 | 0 | 0.0 | 22.1583 | 0 | [11, 583] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_001401__286.json | 0.0 | missing | missing | missing | |
| 1693 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_013049__737 | 0 | 0.0 | 31.7041 | 0 | [469, 172] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_013049__737.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1694 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_013126__846 | 0 | 0.0 | 36.8812 | 0 | [469, 212] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_013126__846.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1695 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_224111__706 | 1 | 0.0 | 79.2699 | 4 | [469, 534] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_224111__706.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1696 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_001338__274 | 0 | 0.0 | 36.4981 | 0 | [455, 820] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231214_001338__274.json | 0.0 | missing | missing | missing | |
| 1697 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_012925__629 | 0 | 0.0 | 58.2451 | 0 | [466, 376] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_012925__629.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1698 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_013018__301 | 0 | 0.0 | 52.6804 | 0 | [466, 334] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_013018__301.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1699 | Apple-MacBook-Pro-M1 | weather_data_analyzer | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_223952__402 | 3 | 0.0 | 53.4697 | 4 | [466, 339] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231226_223952__402.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1700 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_221841__402 | 0 | 0.0 | 22.3348 | 0 | [1, 647] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_221841__402.json | 0.0 | missing | missing | missing | |
| 1701 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_221856__778 | 0 | 0.0 | 15.4998 | 0 | [1, 463] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_221856__778.json | 25.0 | missing | missing | missing | |
| 1702 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_014735__130 | 1 | 0.0 | 24.5535 | 1 | [165, 400] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_014735__130.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1703 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_014806__329 | 3 | 0.0 | 30.5615 | 5 | [165, 501] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_014806__329.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1704 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_224921__441 | 4 | 0.0 | 17.9342 | 5 | [165, 287] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231226_224921__441.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1705 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_221744__173 | 0 | 0.0 | 17.0151 | 0 | [1, 499] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_221744__173.json | 25.0 | missing | missing | missing | |
| 1706 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_221800__388 | 0 | 0.0 | 15.0874 | 0 | [1, 446] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_221800__388.json | 25.0 | missing | missing | missing | |
| 1707 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_014655__745 | 4 | 0.0 | 12.078 | 5 | [206, 181] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_014655__745.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1708 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_014710__494 | 4 | 0.0 | 15.1446 | 5 | [206, 234] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_014710__494.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1709 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_224903__941 | 1 | 0.0 | 18.4789 | 1 | [206, 291] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_224903__941.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1710 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_221646__913 | 0 | 0.0 | 22.4057 | 0 | [1, 624] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_221646__913.json | 25.0 | missing | missing | missing | |
| 1711 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_221708__151 | 0 | 0.0 | 22.0768 | 0 | [1, 613] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_221708__151.json | 0.0 | missing | missing | missing | |
| 1712 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_014624__265 | 1 | 0.0 | 36.6974 | 1 | [302, 419] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_014624__265.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1713 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_014643__691 | 0 | 0.0 | 18.9424 | 0 | [302, 283] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_014643__691.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1714 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_224844__302 | 0 | 0.0 | 36.518 | 0 | [302, 433] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_224844__302.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1715 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_222224__474 | 0 | 0.0 | 27.3763 | 0 | [1, 714] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_222224__474.json | 25.0 | missing | missing | missing | |
| 1716 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_222233__220 | 0 | 0.0 | 8.84217 | 0 | [1, 248] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_222233__220.json | 0.0 | missing | missing | missing | |
| 1717 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_014957__445 | 1 | 0.0 | 26.0856 | 1 | [473, 374] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_014957__445.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1718 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_015014__882 | 4 | 0.0 | 17.0582 | 5 | [473, 225] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_015014__882.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1719 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_225002__870 | 4 | 0.0 | 16.9178 | 5 | [473, 223] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_225002__870.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1720 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_222107__446 | 0 | 0.0 | 32.4677 | 0 | [1, 833] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_222107__446.json | 25.0 | missing | missing | missing | |
| 1721 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_222128__437 | 0 | 0.0 | 21.2514 | 0 | [1, 567] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_222128__437.json | 25.0 | missing | missing | missing | |
| 1722 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_014907__847 | 0 | 0.0 | 23.5198 | 0 | [471, 332] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_014907__847.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1723 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_014930__759 | 4 | 0.0 | 23.614 | 5 | [471, 334] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_014930__759.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1724 | Apple-MacBook-Pro-M1 | weather_data_analyzer | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_224945__339 | 0 | 0.0 | 24.5892 | 0 | [471, 350] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_224945__339.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1725 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | InJulia | 1SHOT | true | false | 5 | 20231214_000734__127 | 0 | 0.0 | 37.866 | 0 | [155, 1005] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__InJulia__1SHOT__20231214_000734__127.json | 25.0 | missing | missing | missing | |
| 1726 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | InJulia | 1SHOT | true | true | 5 | 20231225_011526__122 | 0 | 0.0 | 7.11257 | 0 | [160, 385] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_011526__122.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1727 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | InJulia | 1SHOT | true | true | 5 | 20231225_011530__442 | 0 | 0.0 | 4.83747 | 0 | [160, 260] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_011530__442.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1728 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231226_223300__858 | 0 | 0.0 | 5.39952 | 0 | [160, 290] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__InJulia__1SHOT__20231226_223300__858.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1729 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_000656__212 | 0 | 0.0 | 10.3617 | 0 | [184, 274] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231214_000656__212.json | 0.0 | missing | missing | missing | |
| 1730 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_011512__319 | 0 | 0.0 | 6.82681 | 0 | [197, 356] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_011512__319.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1731 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_011518__195 | 0 | 0.0 | 6.85805 | 0 | [197, 358] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_011518__195.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1732 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_223255__535 | 0 | 0.0 | 7.71129 | 0 | [197, 402] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231226_223255__535.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1733 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_000645__674 | 0 | 0.0 | 28.4978 | 0 | [280, 717] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231214_000645__674.json | 50.0 | missing | missing | missing | |
| 1734 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_011459__251 | 0 | 0.0 | 9.18127 | 0 | [279, 310] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_011459__251.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1735 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_011505__704 | 0 | 0.0 | 5.52149 | 0 | [279, 266] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_011505__704.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1736 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_223247__823 | 0 | 0.0 | 10.1047 | 0 | [279, 365] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231226_223247__823.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1737 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_000920__658 | 0 | 0.0 | 31.3978 | 0 | [11, 801] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231214_000920__658.json | 25.0 | missing | missing | missing | |
| 1738 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_011611__815 | 0 | 0.0 | 10.5591 | 0 | [447, 476] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_011611__815.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1739 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_011617__442 | 0 | 0.0 | 6.67559 | 0 | [447, 283] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_011617__442.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1740 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_223318__266 | 0 | 0.0 | 9.38555 | 0 | [447, 417] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231226_223318__266.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1741 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_000848__225 | 1 | 0.0 | 54.9345 | 1 | [455, 1222] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231214_000848__225.json | 60.0 | missing | missing | missing | |
| 1742 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_011553__245 | 0 | 0.0 | 6.95909 | 0 | [445, 297] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_011553__245.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1743 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_011600__646 | 0 | 0.0 | 6.79605 | 0 | [445, 290] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_011600__646.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1744 | Apple-MacBook-Pro-M1 | weather_data_analyzer | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_223309__796 | 0 | 0.0 | 8.71017 | 0 | [445, 384] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231226_223309__796.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1745 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231213_235544__486 | 0 | 0.0 | 21.3402 | 0 | [155, 592] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__InJulia__1SHOT__20231213_235544__486.json | 50.0 | missing | missing | missing | |
| 1746 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | InJulia | 1SHOT | true | false | 5 | 20231225_005028__204 | 0 | 0.0 | 11.7793 | 0 | [165, 360] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_005028__204.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1747 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_005040__243 | 1 | 0.0 | 11.6089 | 1 | [165, 355] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_005040__243.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1748 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231226_222355__591 | 2 | 0.0 | 16.2779 | 3 | [165, 503] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__InJulia__1SHOT__20231226_222355__591.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1749 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_235522__672 | 0 | 0.0 | 15.8841 | 0 | [184, 430] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231213_235522__672.json | 50.0 | missing | missing | missing | |
| 1750 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_004959__930 | 1 | 0.0 | 16.7639 | 1 | [206, 509] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_004959__930.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1751 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_005016__850 | 0 | 0.0 | 17.6194 | 0 | [206, 537] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_005016__850.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1752 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_222339__489 | 3 | 0.0 | 9.94697 | 5 | [206, 295] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231226_222339__489.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1753 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231213_235507__289 | 0 | 0.0 | 15.2682 | 0 | [280, 374] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231213_235507__289.json | 25.0 | missing | missing | missing | |
| 1754 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_004929__938 | 1 | 0.0 | 21.3053 | 1 | [302, 458] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_004929__938.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1755 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_004942__582 | 1 | 0.0 | 12.4731 | 1 | [302, 355] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_004942__582.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1756 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_222328__413 | 1 | 0.0 | 16.6966 | 1 | [302, 328] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231226_222328__413.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1757 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_235729__852 | 0 | 0.0 | 20.5105 | 0 | [11, 542] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231213_235729__852.json | 0.0 | missing | missing | missing | |
| 1758 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_005152__910 | 1 | 0.0 | 19.1591 | 1 | [473, 528] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_005152__910.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1759 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_005215__302 | 0 | 0.0 | 22.3886 | 0 | [473, 623] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_005215__302.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1760 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_222423__133 | 1 | 0.0 | 13.6871 | 1 | [473, 364] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231226_222423__133.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1761 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_235708__180 | 0 | 0.0 | 52.9346 | 0 | [455, 1179] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231213_235708__180.json | 0.0 | missing | missing | missing | |
| 1762 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_005119__447 | 0 | 0.0 | 16.2618 | 0 | [471, 441] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_005119__447.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1763 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_005133__347 | 1 | 0.0 | 13.5624 | 1 | [471, 360] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_005133__347.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1764 | Apple-MacBook-Pro-M1 | weather_data_analyzer | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_222409__343 | 3 | 0.0 | 13.7029 | 5 | [471, 364] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231226_222409__343.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1765 | Apple-MacBook-Pro-M1 | weather_data_analyzer | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231213_235828__997 | 0 | 0.0 | 18.3943 | 0 | [155, 512] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/yi:34b-chat/evaluation__InJulia__1SHOT__20231213_235828__997.json | 50.0 | missing | missing | missing | |
| 1766 | Apple-MacBook-Pro-M1 | weather_data_analyzer | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231225_005721__410 | 5 | 0.0 | 77.1516 | 5 | [159, 563] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_005721__410.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1767 | Apple-MacBook-Pro-M1 | weather_data_analyzer | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231225_005806__149 | 2 | 0.0 | 43.1652 | 3 | [159, 306] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_005806__149.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1768 | Apple-MacBook-Pro-M1 | weather_data_analyzer | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231226_222726__456 | 4 | 0.0 | 70.1857 | 5 | [159, 514] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/yi:34b-chat/evaluation__InJulia__1SHOT__20231226_222726__456.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1769 | Apple-MacBook-Pro-M1 | weather_data_analyzer | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231213_235809__687 | 0 | 0.0 | 17.33 | 0 | [184, 469] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231213_235809__687.json | 50.0 | missing | missing | missing | |
| 1770 | Apple-MacBook-Pro-M1 | weather_data_analyzer | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_005513__720 | 2 | 0.0 | 50.4912 | 5 | [198, 354] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_005513__720.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1771 | Apple-MacBook-Pro-M1 | weather_data_analyzer | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_005604__989 | 3 | 0.0 | 50.4547 | 5 | [198, 354] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_005604__989.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1772 | Apple-MacBook-Pro-M1 | weather_data_analyzer | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_222616__576 | 4 | 0.0 | 38.8088 | 5 | [198, 267] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231226_222616__576.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1773 | Apple-MacBook-Pro-M1 | weather_data_analyzer | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231213_235752__412 | 0 | 0.0 | 23.3207 | 0 | [280, 587] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231213_235752__412.json | 25.0 | missing | missing | missing | |
| 1774 | Apple-MacBook-Pro-M1 | weather_data_analyzer | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_005335__727 | 0 | 0.0 | 80.1162 | 0 | [290, 373] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_005335__727.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1775 | Apple-MacBook-Pro-M1 | weather_data_analyzer | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_005423__534 | 3 | 0.0 | 47.5103 | 4 | [290, 313] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_005423__534.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1776 | Apple-MacBook-Pro-M1 | weather_data_analyzer | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_222537__371 | 0 | 0.0 | 72.9359 | 0 | [290, 337] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231226_222537__371.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1777 | Apple-MacBook-Pro-M1 | weather_data_analyzer | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231213_235940__176 | 0 | 0.0 | 33.8378 | 0 | [11, 857] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231213_235940__176.json | 0.0 | missing | missing | missing | |
| 1778 | Apple-MacBook-Pro-M1 | weather_data_analyzer | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_010419__472 | 0 | 0.0 | 90.1615 | 0 | [472, 587] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_010419__472.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1779 | Apple-MacBook-Pro-M1 | weather_data_analyzer | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_010516__901 | 0 | 0.0 | 56.9308 | 0 | [472, 349] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_010516__901.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1780 | Apple-MacBook-Pro-M1 | weather_data_analyzer | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_222913__255 | 0 | 0.0 | 58.4082 | 0 | [472, 362] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231226_222913__255.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1781 | Apple-MacBook-Pro-M1 | weather_data_analyzer | yi:34b-chat | JuliaRecapTask | 1SHOT | false | false | 5 | 20231213_235906__520 | 0 | 0.0 | 15.4902 | 0 | [455, 303] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231213_235906__520.json | 0.0 | missing | missing | missing | |
| 1782 | Apple-MacBook-Pro-M1 | weather_data_analyzer | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_010143__773 | 0 | 0.0 | 76.3469 | 0 | [470, 489] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_010143__773.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1783 | Apple-MacBook-Pro-M1 | weather_data_analyzer | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_010249__535 | 0 | 0.0 | 65.9757 | 0 | [470, 415] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_010249__535.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1784 | Apple-MacBook-Pro-M1 | weather_data_analyzer | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_222814__707 | 0 | 0.0 | 48.1711 | 0 | [470, 287] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/data_analysis/weather_data_analyzer/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231226_222814__707.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1785 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | InJulia | 1SHOT | true | true | 3 | 20231225_021310__937 | 3 | 0.0 | 3.97743 | 2 | [83, 61] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_021310__937.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1786 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | InJulia | 1SHOT | true | false | 3 | 20231225_021323__594 | 0 | 0.0 | 13.0299 | 0 | [83, 233] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_021323__594.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1787 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | InJulia | 1SHOT | true | true | 3 | 20231225_161936__194 | 3 | 0.0 | 11.0898 | 2 | [83, 198] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_161936__194.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1788 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | InJulia | 1SHOT | true | true | 3 | 20231225_161948__237 | 0 | 0.0 | 12.2507 | 2 | [83, 219] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_161948__237.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1789 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | InJulia | 1SHOT | true | true | 3 | 20231226_230442__770 | 3 | 0.0 | 12.6976 | 2 | [83, 228] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231226_230442__770.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1790 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_021301__445 | 3 | 0.0 | 9.043 | 2 | [122, 153] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_021301__445.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1791 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_021306__477 | 0 | 0.0 | 4.12251 | 0 | [122, 59] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_021306__477.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1792 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_161920__784 | 3 | 0.0 | 4.30776 | 2 | [122, 62] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_161920__784.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1793 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_161924__192 | 3 | 0.0 | 4.17253 | 2 | [122, 60] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_161924__192.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1794 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231226_230429__479 | 3 | 0.0 | 4.3522 | 2 | [122, 64] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231226_230429__479.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1795 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_021248__310 | 3 | 0.0 | 15.0076 | 2 | [205, 62] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_021248__310.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1796 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_021252__782 | 3 | 0.0 | 4.83215 | 2 | [205, 58] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_021252__782.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1797 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_161902__548 | 0 | 0.0 | 14.4342 | 0 | [205, 56] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_161902__548.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1798 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_161916__161 | 3 | 0.0 | 13.2414 | 2 | [205, 215] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_161916__161.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1799 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231226_230424__576 | 3 | 0.0 | 26.7464 | 2 | [205, 293] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231226_230424__576.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1800 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231225_021501__192 | 0 | 0.0 | 14.3837 | 0 | [387, 202] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_021501__192.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1801 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_021524__647 | 3 | 0.0 | 22.2566 | 2 | [387, 342] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_021524__647.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1802 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231225_162059__810 | 0 | 0.0 | 22.2164 | 0 | [387, 343] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_162059__810.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1803 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_162106__447 | 0 | 0.0 | 6.76183 | 0 | [387, 64] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_162106__447.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1804 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231226_230520__509 | 0 | 0.0 | 32.5327 | 0 | [387, 522] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231226_230520__509.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1805 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_021415__200 | 0 | 0.0 | 26.5851 | 2 | [384, 422] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_021415__200.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1806 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_021447__720 | 0 | 0.0 | 32.0282 | 0 | [384, 516] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_021447__720.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1807 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_162016__943 | 3 | 0.0 | 16.0209 | 2 | [384, 231] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_162016__943.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1808 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_162037__630 | 3 | 0.0 | 20.8627 | 2 | [384, 322] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_162037__630.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1809 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-instruct | JuliaRecapTask | 1SHOT | false | false | 3 | 20231226_230448__788 | 0 | 0.0 | 5.8858 | 0 | [384, 52] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231226_230448__788.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1810 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | InJulia | 1SHOT | false | false | 3 | 20231214_002022__833 | 0 | 0.0 | 6.40511 | 0 | [75, 188] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__InJulia__1SHOT__20231214_002022__833.json | 0.0 | missing | missing | missing | |
| 1811 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | InJulia | 1SHOT | false | false | 3 | 20231225_021600__530 | 0 | 0.0 | 15.3484 | 0 | [57, 283] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_021600__530.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1812 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | InJulia | 1SHOT | false | false | 3 | 20231225_021606__957 | 0 | 0.0 | 5.89997 | 0 | [57, 103] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_021606__957.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1813 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | InJulia | 1SHOT | false | false | 3 | 20231225_162127__954 | 0 | 0.0 | 0.749727 | 0 | [57, 4] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_162127__954.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1814 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | InJulia | 1SHOT | false | false | 3 | 20231225_162132__483 | 0 | 0.0 | 4.86242 | 0 | [57, 84] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_162132__483.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1815 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231214_002015__298 | 0 | 0.0 | 6.45896 | 0 | [105, 179] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231214_002015__298.json | 0.0 | missing | missing | missing | |
| 1816 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_021544__290 | 0 | 0.0 | 7.1357 | 0 | [59, 127] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_021544__290.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1817 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_021545__827 | 0 | 0.0 | 0.76009 | 0 | [59, 4] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_021545__827.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1818 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_162124__433 | 0 | 0.0 | 5.80628 | 0 | [59, 102] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_162124__433.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1819 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_162126__246 | 0 | 0.0 | 2.65232 | 0 | [59, 41] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_162126__246.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1820 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231214_002008__785 | 0 | 0.0 | 15.9813 | 2 | [187, 431] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231214_002008__785.json | 75.0 | missing | missing | missing | |
| 1821 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_021535__590 | 0 | 0.0 | 11.5348 | 0 | [80, 15] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_021535__590.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1822 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_021537__853 | 0 | 0.0 | 1.68872 | 0 | [80, 17] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_021537__853.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1823 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_162117__851 | 0 | 0.0 | 10.8547 | 0 | [80, 1] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_162117__851.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1824 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_162118__559 | 0 | 0.0 | 0.865357 | 0 | [80, 1] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_162118__559.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1825 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231214_002110__479 | 0 | 0.0 | 17.9257 | 2 | [11, 489] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231214_002110__479.json | 75.0 | missing | missing | missing | |
| 1826 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_021812__760 | 0 | 0.0 | 7.40401 | 0 | [76, 127] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_021812__760.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1827 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_162202__710 | 0 | 0.0 | 1.53571 | 0 | [76, 14] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_162202__710.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1828 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_162207__910 | 0 | 0.0 | 4.51848 | 0 | [76, 72] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_162207__910.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1829 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaRecapTask | 1SHOT | true | false | 3 | 20231214_002052__886 | 0 | 0.0 | 22.995 | 0 | [376, 538] | 0.4.0 | 2 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231214_002052__886.json | 25.0 | missing | missing | missing | |
| 1830 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_021804__883 | 0 | 0.0 | 1.6809 | 0 | [73, 17] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_021804__883.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1831 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_162159__226 | 0 | 0.0 | 1.68157 | 0 | [73, 17] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_162159__226.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1832 | Apple-MacBook-Pro-M1 | FloatWithUnits | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_162201__157 | 0 | 0.0 | 1.51064 | 0 | [73, 14] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_162201__157.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1833 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_023255__420 | 3 | 0.0 | 28.8166 | 2 | [74, 168] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_023255__420.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1834 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_023321__429 | 3 | 0.0 | 25.8871 | 2 | [74, 150] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_023321__429.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1835 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_163646__277 | 3 | 0.0 | 43.1555 | 2 | [74, 259] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_163646__277.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1836 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_163728__515 | 3 | 0.0 | 41.9905 | 2 | [74, 252] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_163728__515.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1837 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231226_231325__771 | 3 | 0.0 | 32.8815 | 2 | [74, 194] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231226_231325__771.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1838 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_023131__886 | 3 | 0.0 | 42.651 | 2 | [115, 248] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_023131__886.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1839 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_023226__419 | 0 | 0.0 | 53.6538 | 2 | [115, 316] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_023226__419.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1840 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_163531__762 | 3 | 0.0 | 31.3656 | 2 | [115, 179] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_163531__762.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1841 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 3 | 20231225_163602__804 | 0 | 0.0 | 30.8255 | 0 | [115, 176] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_163602__804.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1842 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231226_231252__164 | 3 | 0.0 | 42.0434 | 2 | [115, 246] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_231252__164.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1843 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_023029__835 | 3 | 0.0 | 69.078 | 2 | [197, 222] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_023029__835.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1844 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_023048__378 | 3 | 0.0 | 19.1021 | 2 | [197, 87] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_023048__378.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1845 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_163430__212 | 0 | 0.0 | 48.2711 | 2 | [197, 105] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_163430__212.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1846 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_163459__345 | 3 | 0.0 | 28.939 | 2 | [197, 149] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_163459__345.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1847 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231226_231210__873 | 0 | 0.0 | 75.213 | 2 | [197, 287] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_231210__873.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1848 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_023705__524 | 0 | 0.0 | 37.5656 | 2 | [403, 169] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_023705__524.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1849 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_023735__219 | 3 | 0.0 | 29.8091 | 2 | [403, 122] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_023735__219.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1850 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_164114__200 | 3 | 0.0 | 40.8368 | 2 | [403, 190] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_164114__200.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1851 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_164135__404 | 3 | 0.0 | 20.9223 | 2 | [403, 68] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_164135__404.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1852 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231226_231439__303 | 3 | 0.0 | 29.5377 | 2 | [403, 121] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_231439__303.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1853 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_023550__704 | 3 | 0.0 | 51.1249 | 2 | [401, 250] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_023550__704.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1854 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_023627__211 | 0 | 0.0 | 35.4669 | 2 | [401, 156] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_023627__211.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1855 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_163924__161 | 0 | 0.0 | 43.12 | 2 | [401, 204] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_163924__161.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1856 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_164032__143 | 3 | 0.0 | 68.5556 | 2 | [401, 357] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_164032__143.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1857 | Apple-MacBook-Pro-M1 | FloatWithUnits | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231226_231409__520 | 3 | 0.0 | 43.7922 | 2 | [401, 208] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_231409__520.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1858 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 3 | 20231226_232017__632 | 0 | 0.0 | 7.32564 | 0 | [76, 282] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231226_232017__632.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1859 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | true | 3 | 20231227_110234__489 | 0 | 0.0 | 7.34088 | 2 | [76, 282] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_110234__489.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1860 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 3 | 20231227_110241__217 | 0 | 0.0 | 6.55351 | 0 | [76, 251] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_110241__217.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1861 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 3 | 20231227_110248__925 | 0 | 0.0 | 6.68985 | 0 | [76, 256] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_110248__925.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1862 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231226_232010__600 | 0 | 0.0 | 1.27291 | 0 | [113, 38] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_232010__600.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1863 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231227_110215__991 | 0 | 0.0 | 1.83226 | 0 | [113, 61] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_110215__991.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1864 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | false | 3 | 20231227_110220__342 | 0 | 0.0 | 5.389 | 0 | [113, 201] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_110220__342.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1865 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231227_110227__121 | 0 | 0.0 | 6.30594 | 0 | [113, 237] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_110227__121.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1866 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231226_232008__226 | 0 | 0.0 | 8.67766 | 0 | [193, 184] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_232008__226.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1867 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231227_110207__574 | 0 | 0.0 | 7.82199 | 0 | [193, 146] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_110207__574.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1868 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231227_110210__485 | 0 | 0.0 | 2.81742 | 0 | [193, 87] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_110210__485.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1869 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231227_110213__819 | 0 | 0.0 | 3.20243 | 0 | [193, 102] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_110213__819.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1870 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231226_232035__963 | 0 | 0.0 | 9.02436 | 0 | [365, 293] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_232035__963.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1871 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231227_110319__288 | 0 | 0.0 | 6.02211 | 0 | [365, 183] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_110319__288.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1872 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231227_110327__589 | 0 | 0.0 | 8.33132 | 0 | [365, 268] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_110327__589.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1873 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231227_110335__365 | 0 | 0.0 | 8.10514 | 0 | [365, 260] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_110335__365.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1874 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | false | 3 | 20231226_232026__386 | 0 | 0.0 | 9.38119 | 0 | [362, 306] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_232026__386.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1875 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 3 | 20231227_110300__374 | 0 | 0.0 | 12.2485 | 0 | [362, 407] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_110300__374.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1876 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | false | 3 | 20231227_110307__642 | 0 | 0.0 | 6.86567 | 0 | [362, 214] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_110307__642.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1877 | Apple-MacBook-Pro-M1 | FloatWithUnits | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 3 | 20231227_110313__981 | 0 | 0.0 | 5.93543 | 0 | [362, 180] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_110313__981.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1878 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | InJulia | 1SHOT | true | true | 3 | 20231225_015934__630 | 0 | 0.0 | 6.59675 | 2 | [75, 194] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__InJulia__1SHOT__20231225_015934__630.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1879 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | InJulia | 1SHOT | false | false | 3 | 20231225_015941__417 | 0 | 0.0 | 7.12235 | 0 | [1, 226] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__InJulia__1SHOT__20231225_015941__417.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1880 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | InJulia | 1SHOT | true | false | 3 | 20231225_160503__191 | 0 | 0.0 | 8.52543 | 0 | [75, 255] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__InJulia__1SHOT__20231225_160503__191.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1881 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | InJulia | 1SHOT | true | false | 3 | 20231225_160510__344 | 0 | 0.0 | 6.88798 | 0 | [1, 221] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__InJulia__1SHOT__20231225_160510__344.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1882 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | InJulia | 1SHOT | false | false | 3 | 20231226_225753__710 | 0 | 0.0 | 8.33919 | 0 | [75, 252] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__InJulia__1SHOT__20231226_225753__710.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1883 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_015923__867 | 0 | 0.0 | 9.05526 | 0 | [105, 259] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_015923__867.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1884 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_015927__913 | 0 | 0.0 | 4.80797 | 0 | [1, 153] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_015927__913.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1885 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_160449__668 | 0 | 0.0 | 6.32191 | 0 | [105, 177] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_160449__668.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1886 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaExpertAsk | 1SHOT | true | false | 3 | 20231225_160455__449 | 0 | 0.0 | 5.41566 | 0 | [1, 173] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_160455__449.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1887 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231226_225744__743 | 0 | 0.0 | 6.66344 | 0 | [105, 190] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaExpertAsk__1SHOT__20231226_225744__743.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1888 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_015905__286 | 0 | 0.0 | 13.0168 | 0 | [205, 196] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_015905__286.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1889 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_015914__427 | 0 | 0.0 | 8.77096 | 0 | [1, 266] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_015914__427.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1890 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_160437__489 | 0 | 0.0 | 13.1635 | 0 | [205, 201] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_160437__489.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1891 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_160443__949 | 0 | 0.0 | 6.14349 | 0 | [1, 190] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_160443__949.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1892 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231226_225738__953 | 0 | 0.0 | 14.9232 | 0 | [205, 270] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_225738__953.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1893 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231225_020048__516 | 0 | 0.0 | 13.9086 | 0 | [11, 384] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_020048__516.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1894 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231225_020101__172 | 0 | 0.0 | 12.6602 | 0 | [1, 356] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_020101__172.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1895 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_160628__754 | 0 | 0.0 | 15.4415 | 0 | [11, 426] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_160628__754.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1896 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_160644__465 | 0 | 0.0 | 16.1672 | 0 | [1, 451] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_160644__465.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1897 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231226_225837__899 | 0 | 0.0 | 15.8751 | 0 | [11, 442] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_225837__899.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1898 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_020021__946 | 0 | 0.0 | 22.2384 | 0 | [376, 518] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_020021__946.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1899 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_020034__750 | 0 | 0.0 | 13.1968 | 0 | [1, 371] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_020034__750.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1900 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_160550__825 | 0 | 0.0 | 23.0734 | 0 | [376, 542] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_160550__825.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1901 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_160612__314 | 0 | 0.0 | 22.125 | 0 | [1, 603] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_160612__314.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1902 | Apple-MacBook-Pro-M1 | FloatWithUnits | llama2 | JuliaRecapTask | 1SHOT | false | false | 3 | 20231226_225821__863 | 0 | 0.0 | 28.8316 | 0 | [376, 691] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/llama2/evaluation__JuliaRecapTask__1SHOT__20231226_225821__863.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1903 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | InJulia | 1SHOT | true | true | 3 | 20231225_021852__907 | 3 | 0.0 | 4.72671 | 2 | [75, 148] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__InJulia__1SHOT__20231225_021852__907.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1904 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | InJulia | 1SHOT | true | true | 3 | 20231225_021857__833 | 3 | 0.0 | 5.11609 | 2 | [75, 161] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__InJulia__1SHOT__20231225_021857__833.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1905 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | InJulia | 1SHOT | true | true | 3 | 20231225_162250__528 | 3 | 0.0 | 9.05802 | 2 | [75, 297] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__InJulia__1SHOT__20231225_162250__528.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1906 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | InJulia | 1SHOT | false | false | 3 | 20231225_162301__220 | 0 | 0.0 | 10.4038 | 0 | [75, 342] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__InJulia__1SHOT__20231225_162301__220.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1907 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | InJulia | 1SHOT | true | true | 3 | 20231226_230546__404 | 3 | 0.0 | 7.47749 | 2 | [75, 242] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__InJulia__1SHOT__20231226_230546__404.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1908 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_021840__352 | 3 | 0.0 | 5.10172 | 2 | [115, 155] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_021840__352.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1909 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_021847__828 | 0 | 0.0 | 7.18337 | 2 | [115, 225] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_021847__828.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1910 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaExpertAsk | 1SHOT | true | false | 3 | 20231225_162235__822 | 0 | 0.0 | 7.1722 | 0 | [115, 228] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_162235__822.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1911 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_162241__346 | 3 | 0.0 | 6.23869 | 2 | [115, 195] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_162241__346.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1912 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231226_230539__758 | 3 | 0.0 | 7.01479 | 2 | [115, 221] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231226_230539__758.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1913 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_021823__846 | 3 | 0.0 | 10.9316 | 2 | [197, 135] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_021823__846.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1914 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_021835__637 | 3 | 0.0 | 11.1291 | 2 | [197, 338] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_021835__637.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1915 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_162219__132 | 3 | 0.0 | 12.2275 | 2 | [197, 180] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_162219__132.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1916 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_162227__601 | 3 | 0.0 | 8.26006 | 2 | [197, 247] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_162227__601.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1917 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231226_230531__307 | 3 | 0.0 | 11.2726 | 2 | [197, 155] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231226_230531__307.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1918 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_021939__615 | 3 | 0.0 | 13.0368 | 2 | [379, 368] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_021939__615.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1919 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_021943__484 | 3 | 0.0 | 3.58493 | 2 | [379, 65] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_021943__484.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1920 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_162337__152 | 3 | 0.0 | 6.62854 | 2 | [379, 166] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_162337__152.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1921 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_162344__973 | 3 | 0.0 | 7.31549 | 2 | [379, 188] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_162344__973.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1922 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231226_230604__310 | 0 | 0.0 | 10.5581 | 0 | [379, 291] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231226_230604__310.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1923 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_021918__920 | 3 | 0.0 | 9.91751 | 2 | [376, 270] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_021918__920.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1924 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_021926__470 | 3 | 0.0 | 7.6606 | 2 | [376, 198] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_021926__470.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1925 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_162323__494 | 3 | 0.0 | 9.02864 | 2 | [376, 244] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_162323__494.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1926 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_162330__274 | 0 | 0.0 | 7.22273 | 0 | [376, 185] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_162330__274.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1927 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder | JuliaRecapTask | 1SHOT | true | true | 3 | 20231226_230554__717 | 3 | 0.0 | 7.43747 | 2 | [376, 191] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder/evaluation__JuliaRecapTask__1SHOT__20231226_230554__717.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1928 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 3 | 20231227_175906__967 | 3 | 0.0 | 11.6984 | 2 | [75, 225] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_175906__967.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1929 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 3 | 20231227_175916__260 | 3 | 0.0 | 10.0851 | 2 | [75, 193] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_175916__260.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1930 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 3 | 20231227_175925__727 | 3 | 0.0 | 8.52954 | 2 | [75, 162] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_175925__727.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1931 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | false | 3 | 20231227_175830__752 | 0 | 0.0 | 13.6456 | 0 | [115, 259] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_175830__752.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1932 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_175840__623 | 3 | 0.0 | 10.2506 | 2 | [115, 192] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_175840__623.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1933 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_175854__369 | 3 | 0.0 | 13.7144 | 2 | [115, 261] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_175854__369.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1934 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_175753__834 | 3 | 0.0 | 15.6918 | 2 | [197, 288] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_175753__834.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1935 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231227_175804__672 | 0 | 0.0 | 10.7005 | 0 | [197, 190] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_175804__672.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1936 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_175816__131 | 3 | 0.0 | 12.5344 | 2 | [197, 226] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_175816__131.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1937 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_180021__405 | 3 | 0.0 | 13.6214 | 2 | [379, 227] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_180021__405.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1938 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_180034__898 | 3 | 0.0 | 12.7945 | 2 | [379, 211] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_180034__898.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1939 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_180052__117 | 3 | 0.0 | 17.3143 | 2 | [379, 298] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_180052__117.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1940 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_175941__489 | 3 | 0.0 | 16.4235 | 2 | [376, 281] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_175941__489.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1941 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_175958__551 | 3 | 0.0 | 17.0007 | 2 | [376, 292] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_175958__551.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1942 | Apple-MacBook-Pro-M1 | FloatWithUnits | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_180008__311 | 3 | 0.0 | 9.38031 | 2 | [376, 145] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_180008__311.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1943 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 3 | 20231225_024039__568 | 0 | 0.0 | 3.19417 | 0 | [71, 71] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_024039__568.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1944 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_024044__853 | 0 | 0.0 | 4.87036 | 2 | [71, 115] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_024044__853.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1945 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 3 | 20231225_164517__985 | 0 | 0.0 | 2.54055 | 0 | [71, 54] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_164517__985.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1946 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_164521__549 | 0 | 0.0 | 3.75451 | 2 | [71, 86] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_164521__549.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1947 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 3 | 20231226_231607__845 | 0 | 0.0 | 3.88225 | 0 | [71, 89] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231226_231607__845.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1948 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_024032__198 | 0 | 0.0 | 2.53663 | 2 | [113, 49] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_024032__198.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1949 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 3 | 20231225_024035__903 | 0 | 0.0 | 3.1098 | 0 | [113, 64] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_024035__903.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1950 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_164512__437 | 0 | 0.0 | 2.50448 | 2 | [113, 48] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_164512__437.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1951 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_164514__767 | 0 | 0.0 | 2.53264 | 2 | [113, 49] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_164514__767.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1952 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231226_231603__454 | 0 | 0.0 | 2.95672 | 0 | [113, 60] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_231603__454.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1953 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_024025__458 | 0 | 0.0 | 10.2196 | 0 | [194, 86] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_024025__458.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1954 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231225_024030__224 | 0 | 0.0 | 4.56892 | 0 | [194, 89] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_024030__224.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1955 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231225_164502__759 | 0 | 0.0 | 11.0954 | 0 | [194, 115] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_164502__759.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1956 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_164509__416 | 0 | 0.0 | 7.21817 | 2 | [194, 158] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_164509__416.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1957 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231226_231600__850 | 0 | 0.0 | 13.7407 | 2 | [194, 188] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_231600__850.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1958 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231225_024136__402 | 0 | 0.0 | 16.2947 | 0 | [380, 360] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_024136__402.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1959 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231225_024150__155 | 0 | 0.0 | 13.9509 | 0 | [380, 302] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_024150__155.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1960 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_164549__898 | 0 | 0.0 | 7.21815 | 2 | [380, 134] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_164549__898.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1961 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231225_164555__764 | 0 | 0.0 | 6.32119 | 0 | [380, 111] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_164555__764.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1962 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231226_231635__428 | 0 | 0.0 | 12.2722 | 0 | [380, 260] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_231635__428.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1963 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_024108__778 | 0 | 0.0 | 15.188 | 0 | [378, 332] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_024108__778.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1964 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_024119__686 | 0 | 0.0 | 11.3201 | 2 | [378, 236] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_024119__686.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1965 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_164535__658 | 0 | 0.0 | 7.92794 | 2 | [378, 152] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_164535__658.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1966 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_164542__104 | 0 | 0.0 | 6.99707 | 0 | [378, 128] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_164542__104.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1967 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 3 | 20231226_231623__130 | 0 | 0.0 | 15.6173 | 0 | [378, 343] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_231623__130.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1968 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 3 | 20231227_231104__982 | 3 | 0.0 | 9.11696 | 2 | [70, 287] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_231104__982.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1969 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 3 | 20231227_231112__815 | 3 | 0.0 | 7.56339 | 2 | [70, 236] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_231112__815.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1970 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | false | 3 | 20231227_231119__269 | 0 | 0.0 | 6.90375 | 0 | [70, 213] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_231119__269.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1971 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 3 | 20231227_231128__692 | 3 | 0.0 | 9.0182 | 2 | [70, 283] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_231128__692.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1972 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 3 | 20231227_231133__263 | 3 | 0.0 | 5.38113 | 2 | [70, 165] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_231133__263.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1973 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_231045__993 | 0 | 0.0 | 3.04765 | 2 | [112, 82] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_231045__993.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1974 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_231047__417 | 3 | 0.0 | 2.54302 | 2 | [112, 65] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_231047__417.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1975 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_231050__927 | 3 | 0.0 | 2.53481 | 2 | [112, 65] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_231050__927.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1976 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_231053__145 | 3 | 0.0 | 2.55616 | 2 | [112, 65] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_231053__145.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1977 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_231055__777 | 3 | 0.0 | 2.52786 | 2 | [112, 65] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_231055__777.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1978 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231227_231015__987 | 0 | 0.0 | 5.65702 | 0 | [193, 126] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231015__987.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1979 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_231021__282 | 0 | 0.0 | 6.15233 | 2 | [193, 169] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231021__282.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1980 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_231029__319 | 0 | 0.0 | 8.55397 | 2 | [193, 247] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231029__319.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1981 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_231039__858 | 0 | 0.0 | 9.32394 | 2 | [193, 271] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231039__858.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1982 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_231042__715 | 0 | 0.0 | 2.8811 | 2 | [193, 62] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231042__715.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1983 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231227_231248__186 | 0 | 0.0 | 11.0062 | 0 | [379, 294] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_231248__186.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1984 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_231258__284 | 0 | 0.0 | 10.0841 | 2 | [379, 265] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_231258__284.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1985 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_231311__976 | 0 | 0.0 | 13.0121 | 2 | [379, 356] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_231311__976.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1986 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231227_231320__924 | 0 | 0.0 | 8.14404 | 0 | [379, 205] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_231320__924.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1987 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231227_231330__938 | 0 | 0.0 | 10.5517 | 0 | [379, 280] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_231330__938.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1988 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_231145__350 | 0 | 0.0 | 11.7938 | 2 | [377, 318] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_231145__350.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1989 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 3 | 20231227_231155__573 | 0 | 0.0 | 10.1525 | 0 | [377, 268] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_231155__573.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1990 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 3 | 20231227_231213__651 | 0 | 0.0 | 17.2689 | 0 | [377, 486] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_231213__651.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1991 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 3 | 20231227_231224__867 | 0 | 0.0 | 11.6653 | 0 | [377, 314] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_231224__867.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1992 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 3 | 20231227_231237__823 | 0 | 0.0 | 12.7693 | 0 | [377, 349] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_231237__823.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1993 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | false | 3 | 20231227_231439__387 | 0 | 0.0 | 12.6618 | 0 | [70, 316] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_231439__387.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1994 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | false | 3 | 20231227_231450__558 | 0 | 0.0 | 10.3246 | 0 | [70, 256] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_231450__558.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1995 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | false | 3 | 20231227_231458__591 | 0 | 0.0 | 8.38527 | 0 | [70, 206] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_231458__591.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1996 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | false | 3 | 20231227_231505__166 | 0 | 0.0 | 7.34028 | 0 | [70, 179] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_231505__166.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1997 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | false | 3 | 20231227_231518__395 | 0 | 0.0 | 12.0791 | 0 | [70, 301] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_231518__395.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1998 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_231414__297 | 3 | 0.0 | 3.16388 | 2 | [112, 65] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_231414__297.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 1999 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_231417__971 | 3 | 0.0 | 3.15585 | 2 | [112, 65] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_231417__971.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2000 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_231420__721 | 3 | 0.0 | 3.16775 | 2 | [112, 65] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_231420__721.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2001 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_231423__307 | 3 | 0.0 | 3.15599 | 2 | [112, 65] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_231423__307.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2002 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_231427__150 | 3 | 0.0 | 3.16834 | 2 | [112, 65] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_231427__150.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2003 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231227_231336__261 | 0 | 0.0 | 5.86824 | 0 | [193, 101] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231336__261.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2004 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_231341__356 | 3 | 0.0 | 5.04488 | 2 | [193, 101] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231341__356.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2005 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_231352__849 | 3 | 0.0 | 10.9519 | 2 | [193, 252] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231352__849.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2006 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231227_231407__234 | 0 | 0.0 | 15.1453 | 0 | [193, 358] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231407__234.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2007 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_231411__513 | 0 | 0.0 | 3.38626 | 2 | [193, 58] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231411__513.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2008 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_231648__298 | 0 | 0.0 | 9.38932 | 2 | [379, 187] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_231648__298.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2009 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231227_231659__565 | 0 | 0.0 | 10.7949 | 0 | [379, 222] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_231659__565.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2010 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_231712__988 | 0 | 0.0 | 13.1782 | 2 | [379, 281] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_231712__988.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2011 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231227_231725__873 | 0 | 0.0 | 12.6866 | 0 | [379, 269] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_231725__873.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2012 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231227_231737__626 | 0 | 0.0 | 12.8111 | 0 | [379, 272] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_231737__626.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2013 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 3 | 20231227_231532__264 | 0 | 0.0 | 14.465 | 0 | [377, 313] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_231532__264.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2014 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 3 | 20231227_231545__431 | 0 | 0.0 | 13.415 | 0 | [377, 287] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_231545__431.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2015 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 3 | 20231227_231605__819 | 0 | 0.0 | 19.5552 | 0 | [377, 437] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_231605__819.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2016 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_231624__517 | 3 | 0.0 | 18.7995 | 2 | [377, 419] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_231624__517.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2017 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 3 | 20231227_231638__212 | 0 | 0.0 | 14.6301 | 0 | [377, 317] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_231638__212.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2018 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | false | 3 | 20231226_121005__633 | 0 | 0.0 | 14.2311 | 0 | [70, 256] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_121005__633.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2019 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | false | 3 | 20231226_121017__843 | 0 | 0.0 | 11.9423 | 0 | [70, 214] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_121017__843.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2020 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | false | 3 | 20231226_231924__951 | 0 | 0.0 | 19.5435 | 0 | [70, 359] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_231924__951.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2021 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231226_120947__277 | 3 | 0.0 | 4.28954 | 2 | [112, 66] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_120947__277.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2022 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231226_120951__990 | 3 | 0.0 | 4.24373 | 2 | [112, 65] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_120951__990.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2023 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231226_231904__164 | 3 | 0.0 | 4.18562 | 2 | [112, 66] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_231904__164.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2024 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231226_120936__534 | 3 | 0.0 | 4.33717 | 2 | [193, 57] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_120936__534.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2025 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231226_120942__959 | 3 | 0.0 | 6.51527 | 2 | [193, 97] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_120942__959.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2026 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231226_231900__844 | 0 | 0.0 | 21.7417 | 0 | [193, 218] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_231900__844.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2027 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231226_121137__421 | 0 | 0.0 | 15.8534 | 0 | [379, 248] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_121137__421.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2028 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231226_121156__395 | 0 | 0.0 | 19.5119 | 0 | [379, 312] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_121156__395.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2029 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231226_232000__947 | 0 | 0.0 | 15.7192 | 2 | [379, 252] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_232000__947.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2030 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | false | 3 | 20231226_121102__195 | 0 | 0.0 | 17.7423 | 0 | [377, 280] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_121102__195.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2031 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 3 | 20231226_121120__470 | 0 | 0.0 | 17.9931 | 2 | [377, 285] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_121120__470.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2032 | Apple-MacBook-Pro-M1 | FloatWithUnits | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | false | 3 | 20231226_231944__277 | 0 | 0.0 | 19.9442 | 0 | [377, 329] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_231944__277.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2033 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | false | 3 | 20231227_110712__574 | 0 | 0.0 | 44.5699 | 0 | [78, 261] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_110712__574.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2034 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231227_110744__172 | 3 | 0.0 | 31.821 | 2 | [78, 183] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_110744__172.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2035 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | false | 3 | 20231227_110829__966 | 0 | 0.0 | 44.8916 | 0 | [78, 263] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_110829__966.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2036 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | false | 3 | 20231227_144830__692 | 0 | 0.0 | 53.9822 | 0 | [78, 317] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_144830__692.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2037 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | false | 3 | 20231227_144902__360 | 0 | 0.0 | 31.2832 | 0 | [78, 179] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_144902__360.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2038 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_110601__700 | 3 | 0.0 | 13.857 | 2 | [117, 67] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_110601__700.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2039 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_110615__449 | 3 | 0.0 | 13.7058 | 2 | [117, 66] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_110615__449.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2040 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231227_110627__472 | 0 | 0.0 | 12.4108 | 0 | [117, 58] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_110627__472.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2041 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_144721__984 | 3 | 0.0 | 20.4395 | 2 | [117, 107] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_144721__984.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2042 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231227_144736__227 | 3 | 0.0 | 14.7458 | 2 | [117, 72] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_144736__227.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2043 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231227_110434__998 | 0 | 0.0 | 58.605 | 0 | [197, 294] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_110434__998.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2044 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231227_110507__541 | 0 | 0.0 | 33.206 | 0 | [197, 171] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_110507__541.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2045 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231227_110547__982 | 0 | 0.0 | 40.3086 | 0 | [197, 214] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_110547__982.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2046 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_144552__781 | 3 | 0.0 | 35.6767 | 2 | [197, 185] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_144552__781.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2047 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231227_144701__969 | 3 | 0.0 | 69.1044 | 2 | [197, 385] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_144701__969.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2048 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231227_111137__175 | 0 | 0.0 | 55.0954 | 0 | [391, 268] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_111137__175.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2049 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231227_111214__138 | 0 | 0.0 | 36.7047 | 0 | [391, 159] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_111214__138.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2050 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231227_111302__445 | 0 | 0.0 | 48.6615 | 0 | [391, 230] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_111302__445.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2051 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_145152__115 | 3 | 0.0 | 61.1757 | 2 | [391, 302] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_145152__115.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2052 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231227_145237__811 | 3 | 0.0 | 44.3237 | 2 | [391, 203] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_145237__811.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2053 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_110900__608 | 3 | 0.0 | 31.1723 | 2 | [389, 126] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_110900__608.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2054 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 3 | 20231227_110943__911 | 0 | 0.0 | 43.4157 | 0 | [389, 199] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_110943__911.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2055 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_111042__420 | 0 | 0.0 | 58.2972 | 0 | [389, 287] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_111042__420.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2056 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 3 | 20231227_144946__136 | 0 | 0.0 | 44.8223 | 0 | [389, 206] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_144946__136.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2057 | Apple-MacBook-Pro-M1 | FloatWithUnits | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231227_145051__592 | 3 | 0.0 | 64.3833 | 2 | [389, 321] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_145051__592.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2058 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | false | 3 | 20231225_024232__888 | 0 | 0.0 | 8.32703 | 0 | [79, 205] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_024232__888.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2059 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_024239__649 | 3 | 0.0 | 6.7152 | 2 | [79, 163] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_024239__649.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2060 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_164638__305 | 0 | 0.0 | 7.07638 | 2 | [79, 173] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_164638__305.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2061 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_164646__250 | 0 | 0.0 | 7.66105 | 2 | [79, 188] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_164646__250.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2062 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | false | 3 | 20231226_231707__192 | 0 | 0.0 | 6.83955 | 0 | [79, 166] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231226_231707__192.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2063 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_024220__523 | 0 | 0.0 | 3.50493 | 2 | [121, 74] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_024220__523.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2064 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_024224__444 | 0 | 0.0 | 3.50741 | 2 | [121, 74] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_024224__444.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2065 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_164628__524 | 3 | 0.0 | 3.15992 | 2 | [121, 65] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_164628__524.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2066 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_164631__174 | 3 | 0.0 | 3.15353 | 2 | [121, 65] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_164631__174.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2067 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231226_231700__238 | 3 | 0.0 | 6.55295 | 2 | [121, 153] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_231700__238.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2068 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231225_024208__996 | 0 | 0.0 | 18.4875 | 0 | [202, 283] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_024208__996.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2069 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_024217__293 | 0 | 0.0 | 8.54066 | 0 | [202, 191] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_024217__293.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2070 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_164613__722 | 3 | 0.0 | 17.8829 | 2 | [202, 264] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_164613__722.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2071 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_164624__878 | 0 | 0.0 | 10.77 | 2 | [202, 249] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_164624__878.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2072 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231226_231653__746 | 3 | 0.0 | 17.9349 | 2 | [202, 279] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_231653__746.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2073 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_024336__582 | 0 | 0.0 | 12.5092 | 2 | [388, 262] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_024336__582.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2074 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_024348__646 | 0 | 0.0 | 12.2442 | 2 | [388, 255] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_024348__646.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2075 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_164736__811 | 3 | 0.0 | 10.3924 | 2 | [388, 210] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_164736__811.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2076 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231225_164748__278 | 0 | 0.0 | 11.5615 | 0 | [388, 239] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_164748__278.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2077 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231226_231730__434 | 0 | 0.0 | 11.8748 | 0 | [388, 246] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_231730__434.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2078 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_024305__258 | 0 | 0.0 | 9.87868 | 0 | [386, 196] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_024305__258.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2079 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_024323__968 | 0 | 0.0 | 18.3992 | 0 | [386, 407] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_024323__968.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2080 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_164715__232 | 0 | 0.0 | 11.5693 | 2 | [386, 239] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_164715__232.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2081 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_164726__886 | 3 | 0.0 | 10.4577 | 2 | [386, 211] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_164726__886.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2082 | Apple-MacBook-Pro-M1 | FloatWithUnits | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 3 | 20231226_231718__282 | 0 | 0.0 | 11.6793 | 0 | [386, 241] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_231718__282.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2083 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 3 | 20231225_020201__969 | 0 | 0.0 | 11.8559 | 0 | [77, 379] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_020201__969.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2084 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 3 | 20231225_020208__244 | 0 | 0.0 | 6.09286 | 0 | [77, 191] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_020208__244.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2085 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 3 | 20231225_160738__788 | 0 | 0.0 | 10.4223 | 0 | [77, 337] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_160738__788.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2086 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 3 | 20231225_160746__611 | 0 | 0.0 | 7.58797 | 2 | [77, 242] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_160746__611.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2087 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 3 | 20231226_225903__197 | 0 | 0.0 | 9.45056 | 2 | [77, 303] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231226_225903__197.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2088 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | false | 3 | 20231225_020140__962 | 0 | 0.0 | 9.10963 | 0 | [119, 284] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_020140__962.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2089 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_020149__735 | 3 | 0.0 | 9.80019 | 2 | [119, 307] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_020149__735.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2090 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_160721__524 | 3 | 0.0 | 10.2945 | 2 | [119, 326] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_160721__524.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2091 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_160728__810 | 0 | 0.0 | 6.59648 | 0 | [119, 203] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_160728__810.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2092 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231226_225854__264 | 0 | 0.0 | 3.63372 | 2 | [119, 103] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231226_225854__264.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2093 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231225_020115__956 | 0 | 0.0 | 14.2158 | 0 | [200, 259] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_020115__956.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2094 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_020130__662 | 3 | 0.0 | 14.9413 | 2 | [200, 454] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_020130__662.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2095 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_160700__635 | 3 | 0.0 | 15.7963 | 2 | [200, 319] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_160700__635.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2096 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231225_160711__521 | 0 | 0.0 | 10.4282 | 0 | [200, 313] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_160711__521.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2097 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231226_225850__255 | 3 | 0.0 | 12.4874 | 2 | [200, 218] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231226_225850__255.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2098 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_020245__425 | 0 | 0.0 | 2.49209 | 0 | [386, 21] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_020245__425.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2099 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_020247__470 | 0 | 0.0 | 2.36847 | 0 | [386, 17] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_020247__470.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2100 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_160834__630 | 3 | 0.0 | 11.3783 | 2 | [386, 307] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_160834__630.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2101 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_160836__770 | 0 | 0.0 | 2.64802 | 0 | [386, 26] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_160836__770.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2102 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231226_225916__988 | 0 | 0.0 | 4.10495 | 0 | [386, 74] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231226_225916__988.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2103 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_020235__905 | 0 | 0.0 | 10.4227 | 2 | [384, 280] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_020235__905.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2104 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_020242__909 | 0 | 0.0 | 7.02635 | 2 | [384, 173] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_020242__909.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2105 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_160811__852 | 0 | 0.0 | 10.9544 | 2 | [384, 300] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_160811__852.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2106 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_160822__851 | 3 | 0.0 | 11.5297 | 2 | [384, 318] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_160822__851.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2107 | Apple-MacBook-Pro-M1 | FloatWithUnits | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 3 | 20231226_225912__645 | 0 | 0.0 | 8.71486 | 2 | [384, 227] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231226_225912__645.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2108 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | InJulia | 1SHOT | true | true | 3 | 20231225_022130__265 | 3 | 0.0 | 7.39047 | 2 | [78, 126] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__InJulia__1SHOT__20231225_022130__265.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2109 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | InJulia | 1SHOT | true | false | 3 | 20231225_022140__453 | 0 | 0.0 | 10.0571 | 0 | [78, 177] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__InJulia__1SHOT__20231225_022140__453.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2110 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | InJulia | 1SHOT | true | true | 3 | 20231225_162524__948 | 0 | 0.0 | 5.51937 | 2 | [78, 91] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__InJulia__1SHOT__20231225_162524__948.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2111 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | InJulia | 1SHOT | true | false | 3 | 20231225_162534__510 | 0 | 0.0 | 9.38136 | 0 | [78, 165] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__InJulia__1SHOT__20231225_162534__510.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2112 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | InJulia | 1SHOT | true | false | 3 | 20231226_230705__476 | 0 | 0.0 | 12.7967 | 0 | [78, 229] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__InJulia__1SHOT__20231226_230705__476.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2113 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_022118__755 | 0 | 0.0 | 4.16654 | 0 | [118, 60] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_022118__755.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2114 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_022122__839 | 0 | 0.0 | 4.17664 | 0 | [118, 60] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_022122__839.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2115 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_162515__378 | 0 | 0.0 | 3.9497 | 0 | [118, 56] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_162515__378.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2116 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_162519__626 | 0 | 0.0 | 4.14889 | 0 | [118, 60] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_162519__626.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2117 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231226_230652__964 | 0 | 0.0 | 4.16354 | 0 | [118, 60] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231226_230652__964.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2118 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_022109__508 | 3 | 0.0 | 30.2381 | 2 | [200, 349] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_022109__508.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2119 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_022114__114 | 0 | 0.0 | 5.18613 | 0 | [200, 64] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_022114__114.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2120 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_162504__282 | 0 | 0.0 | 20.5389 | 2 | [200, 178] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_162504__282.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2121 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_162511__768 | 0 | 0.0 | 6.29566 | 0 | [200, 86] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_162511__768.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2122 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231226_230648__346 | 0 | 0.0 | 15.6908 | 0 | [200, 94] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231226_230648__346.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2123 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_022251__567 | 0 | 0.0 | 25.3848 | 0 | [382, 400] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_022251__567.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2124 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_022311__414 | 0 | 0.0 | 19.6931 | 0 | [382, 300] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_022311__414.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2125 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_162631__151 | 0 | 0.0 | 6.74475 | 0 | [382, 68] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_162631__151.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2126 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_162717__248 | 0 | 0.0 | 45.8984 | 0 | [382, 751] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_162717__248.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2127 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231226_230748__726 | 0 | 0.0 | 28.7804 | 0 | [382, 460] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231226_230748__726.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2128 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_022210__247 | 0 | 0.0 | 14.8225 | 0 | [379, 214] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_022210__247.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2129 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_022226__523 | 0 | 0.0 | 16.2649 | 0 | [379, 240] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_022226__523.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2130 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_162606__887 | 0 | 0.0 | 13.7023 | 0 | [379, 195] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_162606__887.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2131 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_162624__593 | 0 | 0.0 | 18.2922 | 2 | [379, 278] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_162624__593.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2132 | Apple-MacBook-Pro-M1 | FloatWithUnits | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 3 | 20231226_230719__906 | 0 | 0.0 | 14.2139 | 0 | [379, 204] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231226_230719__906.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2133 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 3 | 20231225_024422__584 | 0 | 0.0 | 2.14733 | 0 | [70, 77] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_024422__584.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2134 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 3 | 20231225_024446__712 | 0 | 0.0 | 24.144 | 0 | [70, 895] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_024446__712.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2135 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 3 | 20231225_164834__842 | 0 | 0.0 | 1.91842 | 0 | [70, 68] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_164834__842.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2136 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 3 | 20231225_164850__589 | 0 | 0.0 | 16.0543 | 0 | [70, 612] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_164850__589.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2137 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 3 | 20231226_231758__701 | 0 | 0.0 | 1.77632 | 0 | [70, 62] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231226_231758__701.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2138 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_024419__793 | 0 | 0.0 | 22.8901 | 0 | [107, 843] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_024419__793.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2139 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_024420__330 | 0 | 0.0 | 0.432599 | 0 | [107, 4] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_024420__330.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2140 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_164815__675 | 0 | 0.0 | 19.9099 | 0 | [107, 741] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_164815__675.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2141 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_164832__840 | 0 | 0.0 | 17.168 | 0 | [107, 645] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_164832__840.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2142 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231226_231756__172 | 0 | 0.0 | 20.5682 | 0 | [107, 761] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_231756__172.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2143 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_024354__530 | 0 | 0.0 | 5.9611 | 0 | [187, 69] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_024354__530.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2144 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_024356__878 | 0 | 0.0 | 2.17668 | 0 | [187, 65] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_024356__878.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2145 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_164753__170 | 0 | 0.0 | 4.77524 | 0 | [187, 22] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_164753__170.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2146 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_164755__653 | 0 | 0.0 | 2.50078 | 0 | [187, 78] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_164755__653.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2147 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231226_231736__383 | 0 | 0.0 | 5.38656 | 0 | [187, 55] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_231736__383.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2148 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_024526__548 | 0 | 0.0 | 7.35115 | 0 | [359, 233] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_024526__548.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2149 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_024528__535 | 0 | 0.0 | 1.30283 | 0 | [359, 4] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_024528__535.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2150 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_164953__847 | 0 | 0.0 | 1.33173 | 0 | [359, 5] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_164953__847.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2151 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_164956__539 | 0 | 0.0 | 2.65685 | 0 | [359, 57] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_164956__539.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2152 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231226_231838__649 | 0 | 0.0 | 9.52533 | 0 | [359, 312] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_231838__649.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2153 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_024515__747 | 0 | 0.0 | 5.33047 | 0 | [356, 158] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_024515__747.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2154 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_024519__787 | 0 | 0.0 | 4.25717 | 0 | [356, 118] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_024519__787.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2155 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_164936__689 | 0 | 0.0 | 1.22583 | 0 | [356, 1] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_164936__689.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2156 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_164952__408 | 0 | 0.0 | 15.6023 | 0 | [356, 530] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_164952__408.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2157 | Apple-MacBook-Pro-M1 | FloatWithUnits | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | false | 3 | 20231226_231829__157 | 0 | 0.0 | 30.8199 | 0 | [356, 1028] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_231829__157.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2158 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 3 | 20231225_022542__396 | 3 | 0.0 | 34.4357 | 2 | [86, 264] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_022542__396.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2159 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 3 | 20231225_022603__902 | 3 | 0.0 | 21.2343 | 2 | [86, 157] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_022603__902.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2160 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 3 | 20231225_163015__458 | 3 | 0.0 | 35.9551 | 2 | [86, 278] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_163015__458.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2161 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 3 | 20231225_163047__577 | 3 | 0.0 | 31.0651 | 2 | [86, 238] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_163047__577.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2162 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 3 | 20231226_230952__444 | 3 | 0.0 | 31.7118 | 2 | [86, 241] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231226_230952__444.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2163 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_022441__240 | 3 | 0.0 | 29.0499 | 2 | [126, 215] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_022441__240.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2164 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_022508__848 | 3 | 0.0 | 26.9198 | 2 | [126, 198] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_022508__848.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2165 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_162919__482 | 3 | 0.0 | 29.3778 | 2 | [126, 219] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_162919__482.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2166 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_162939__489 | 3 | 0.0 | 20.6128 | 2 | [126, 148] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_162939__489.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2167 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231226_230921__250 | 3 | 0.0 | 26.0644 | 2 | [126, 191] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231226_230921__250.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2168 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_022354__939 | 3 | 0.0 | 43.3092 | 2 | [208, 131] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_022354__939.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2169 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_022411__904 | 3 | 0.0 | 17.0455 | 2 | [208, 102] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_022411__904.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2170 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_162819__150 | 3 | 0.0 | 62.264 | 2 | [208, 291] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_162819__150.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2171 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231225_162849__217 | 0 | 0.0 | 30.2435 | 0 | [208, 209] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_162849__217.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2172 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231226_230854__296 | 3 | 0.0 | 66.8553 | 2 | [208, 333] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_230854__296.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2173 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_022854__555 | 3 | 0.0 | 54.9123 | 2 | [390, 365] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_022854__555.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2174 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_022919__446 | 3 | 0.0 | 24.9724 | 2 | [390, 132] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_022919__446.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2175 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_163311__646 | 3 | 0.0 | 21.0563 | 2 | [390, 102] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_163311__646.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2176 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231225_163342__913 | 0 | 0.0 | 30.4332 | 0 | [390, 176] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_163342__913.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2177 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231226_231055__310 | 3 | 0.0 | 20.9444 | 2 | [390, 101] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_231055__310.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2178 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_022724__796 | 0 | 0.0 | 35.3862 | 0 | [387, 214] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_022724__796.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2179 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_022759__420 | 3 | 0.0 | 35.7649 | 2 | [387, 217] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_022759__420.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2180 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_163203__373 | 0 | 0.0 | 31.2006 | 0 | [387, 181] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_163203__373.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2181 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_163250__197 | 3 | 0.0 | 46.9267 | 2 | [387, 305] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_163250__197.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2182 | Apple-MacBook-Pro-M1 | FloatWithUnits | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | false | 3 | 20231226_231034__659 | 0 | 0.0 | 41.2017 | 0 | [387, 260] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231226_231034__659.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2183 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_023840__685 | 0 | 0.0 | 9.30966 | 2 | [79, 152] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_023840__685.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2184 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 3 | 20231225_023858__816 | 0 | 0.0 | 17.7414 | 0 | [79, 299] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_023858__816.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2185 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 3 | 20231225_164249__121 | 0 | 0.0 | 9.87758 | 2 | [79, 162] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_164249__121.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2186 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 3 | 20231225_164305__736 | 0 | 0.0 | 15.7425 | 0 | [79, 265] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_164305__736.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2187 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 3 | 20231226_231518__292 | 0 | 0.0 | 10.7384 | 0 | [79, 177] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231226_231518__292.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2188 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_023823__381 | 0 | 0.0 | 10.6394 | 0 | [121, 170] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_023823__381.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2189 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_023831__277 | 0 | 0.0 | 8.48093 | 0 | [121, 132] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_023831__277.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2190 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_164230__796 | 3 | 0.0 | 14.9636 | 2 | [121, 246] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_164230__796.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2191 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_164239__500 | 0 | 0.0 | 9.39463 | 2 | [121, 148] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_164239__500.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2192 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231226_231507__970 | 0 | 0.0 | 4.84049 | 0 | [121, 68] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_231507__970.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2193 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231225_023757__768 | 0 | 0.0 | 22.3564 | 0 | [202, 195] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_023757__768.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2194 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231225_023812__313 | 0 | 0.0 | 14.6249 | 0 | [202, 225] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_023812__313.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2195 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231225_164200__595 | 0 | 0.0 | 24.4762 | 0 | [202, 240] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_164200__595.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2196 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231225_164215__145 | 0 | 0.0 | 14.9403 | 0 | [202, 230] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_164215__145.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2197 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231226_231502__690 | 0 | 0.0 | 23.2629 | 0 | [202, 226] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_231502__690.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2198 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_024001__713 | 0 | 0.0 | 15.0078 | 0 | [388, 202] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_024001__713.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2199 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_024015__612 | 0 | 0.0 | 13.9952 | 0 | [388, 185] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_024015__612.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2200 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231225_164434__886 | 0 | 0.0 | 22.1133 | 0 | [388, 322] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_164434__886.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2201 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_164451__213 | 0 | 0.0 | 16.2074 | 0 | [388, 223] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_164451__213.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2202 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231226_231546__486 | 0 | 0.0 | 14.1104 | 0 | [388, 187] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_231546__486.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2203 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_023930__524 | 0 | 0.0 | 13.7603 | 2 | [386, 181] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_023930__524.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2204 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_023946__748 | 3 | 0.0 | 16.2422 | 2 | [386, 223] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_023946__748.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2205 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_164352__905 | 0 | 0.0 | 15.9337 | 0 | [386, 218] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_164352__905.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2206 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_164412__343 | 0 | 0.0 | 20.4831 | 0 | [386, 295] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_164412__343.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2207 | Apple-MacBook-Pro-M1 | FloatWithUnits | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 3 | 20231226_231532__220 | 0 | 0.0 | 14.3949 | 0 | [386, 192] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_231532__220.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2208 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | InJulia | 1SHOT | true | false | 3 | 20231225_022003__933 | 0 | 0.0 | 6.59664 | 0 | [81, 372] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_022003__933.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2209 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | InJulia | 1SHOT | true | false | 3 | 20231225_022006__203 | 0 | 0.0 | 2.69217 | 0 | [81, 150] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_022006__203.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2210 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | InJulia | 1SHOT | true | false | 3 | 20231225_162409__914 | 0 | 0.0 | 6.22203 | 0 | [81, 354] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_162409__914.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2211 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | InJulia | 1SHOT | true | false | 3 | 20231225_162412__578 | 0 | 0.0 | 3.41342 | 0 | [81, 193] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_162412__578.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2212 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | InJulia | 1SHOT | true | false | 3 | 20231226_230620__807 | 0 | 0.0 | 6.31514 | 0 | [81, 357] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__InJulia__1SHOT__20231226_230620__807.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2213 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_021956__324 | 0 | 0.0 | 1.56328 | 0 | [118, 77] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_021956__324.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2214 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_021957__936 | 0 | 0.0 | 1.15211 | 0 | [118, 51] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_021957__936.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2215 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_162400__939 | 0 | 0.0 | 1.28942 | 0 | [118, 61] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_162400__939.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2216 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231225_162403__414 | 0 | 0.0 | 2.94006 | 0 | [118, 159] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_162403__414.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2217 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 3 | 20231226_230614__227 | 0 | 0.0 | 2.54152 | 0 | [118, 135] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231226_230614__227.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2218 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_021949__957 | 0 | 0.0 | 6.3683 | 0 | [196, 168] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_021949__957.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2219 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_021954__339 | 0 | 0.0 | 4.78461 | 0 | [196, 244] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_021954__339.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2220 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231225_162352__230 | 0 | 0.0 | 7.15043 | 0 | [196, 232] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_162352__230.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2221 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 3 | 20231225_162358__455 | 0 | 0.0 | 6.7291 | 0 | [196, 353] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_162358__455.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2222 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231226_230611__187 | 0 | 0.0 | 6.87398 | 0 | [196, 209] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231226_230611__187.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2223 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231225_022034__609 | 0 | 0.0 | 8.89758 | 0 | [368, 416] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_022034__609.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2224 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231225_022039__485 | 0 | 0.0 | 4.96616 | 0 | [368, 213] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_022039__485.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2225 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231225_162438__694 | 0 | 0.0 | 7.30046 | 0 | [368, 336] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_162438__694.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2226 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231225_162444__455 | 0 | 0.0 | 5.0431 | 0 | [368, 219] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_162444__455.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2227 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 3 | 20231226_230632__102 | 0 | 0.0 | 5.93817 | 0 | [368, 265] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231226_230632__102.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2228 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_022019__559 | 0 | 0.0 | 8.04899 | 0 | [366, 373] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_022019__559.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2229 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_022025__992 | 0 | 0.0 | 5.23332 | 0 | [366, 228] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_022025__992.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2230 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_162426__819 | 0 | 0.0 | 7.0332 | 0 | [366, 324] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_162426__819.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2231 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 3 | 20231225_162431__512 | 0 | 0.0 | 5.42125 | 0 | [366, 240] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_162431__512.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2232 | Apple-MacBook-Pro-M1 | FloatWithUnits | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 3 | 20231226_230626__375 | 0 | 0.0 | 5.89345 | 0 | [366, 263] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231226_230626__375.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2233 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | InJulia | 1SHOT | true | true | 3 | 20231225_020334__104 | 0 | 0.0 | 4.29972 | 2 | [79, 130] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_020334__104.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2234 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | InJulia | 1SHOT | true | true | 3 | 20231225_020340__936 | 3 | 0.0 | 6.43484 | 2 | [79, 201] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_020340__936.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2235 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | InJulia | 1SHOT | true | true | 3 | 20231225_160920__411 | 0 | 0.0 | 6.7432 | 2 | [79, 214] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_160920__411.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2236 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | InJulia | 1SHOT | true | false | 3 | 20231225_160930__989 | 0 | 0.0 | 9.07642 | 0 | [79, 290] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_160930__989.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2237 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | InJulia | 1SHOT | true | true | 3 | 20231226_225942__956 | 3 | 0.0 | 6.70632 | 2 | [79, 211] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__InJulia__1SHOT__20231226_225942__956.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2238 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_020323__338 | 3 | 0.0 | 5.40949 | 2 | [121, 162] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_020323__338.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2239 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_020329__491 | 3 | 0.0 | 6.40385 | 2 | [121, 195] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_020329__491.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2240 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_160909__210 | 0 | 0.0 | 5.9557 | 2 | [121, 181] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_160909__210.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2241 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_160914__275 | 3 | 0.0 | 4.66073 | 2 | [121, 138] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_160914__275.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2242 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231226_225935__944 | 3 | 0.0 | 5.12813 | 2 | [121, 153] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231226_225935__944.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2243 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_020304__632 | 0 | 0.0 | 16.7576 | 2 | [202, 335] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_020304__632.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2244 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231225_020317__771 | 0 | 0.0 | 13.1862 | 0 | [202, 397] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_020317__771.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2245 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231225_160855__215 | 0 | 0.0 | 18.2151 | 0 | [202, 391] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_160855__215.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2246 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231225_160903__178 | 0 | 0.0 | 8.37673 | 0 | [202, 246] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_160903__178.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2247 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231226_225930__374 | 0 | 0.0 | 14.179 | 0 | [202, 266] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231226_225930__374.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2248 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_020434__469 | 0 | 0.0 | 11.0287 | 2 | [388, 294] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_020434__469.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2249 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231225_020450__741 | 0 | 0.0 | 16.0486 | 0 | [388, 450] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_020450__741.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2250 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_161017__782 | 3 | 0.0 | 11.0697 | 2 | [388, 298] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_161017__782.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2251 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_161025__473 | 3 | 0.0 | 8.42716 | 2 | [388, 214] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_161025__473.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2252 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231226_225959__995 | 0 | 0.0 | 7.0304 | 2 | [388, 168] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231226_225959__995.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2253 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_020409__911 | 0 | 0.0 | 12.7691 | 0 | [386, 349] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_020409__911.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2254 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_020423__768 | 0 | 0.0 | 14.2025 | 0 | [386, 393] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_020423__768.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2255 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_160954__621 | 0 | 0.0 | 8.58721 | 2 | [386, 219] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_160954__621.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2256 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaRecapTask | 1SHOT | true | false | 3 | 20231225_161006__813 | 0 | 0.0 | 11.1601 | 0 | [386, 300] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_161006__813.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2257 | Apple-MacBook-Pro-M1 | FloatWithUnits | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 3 | 20231226_225952__157 | 3 | 0.0 | 9.8437 | 2 | [386, 258] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231226_225952__157.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2258 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | InJulia | 1SHOT | true | true | 3 | 20231225_020704__517 | 0 | 0.0 | 37.6375 | 2 | [78, 280] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_020704__517.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2259 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | InJulia | 1SHOT | true | false | 3 | 20231225_020740__602 | 0 | 0.0 | 33.9229 | 0 | [78, 247] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_020740__602.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2260 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | InJulia | 1SHOT | true | true | 3 | 20231225_161311__672 | 3 | 0.0 | 51.4477 | 2 | [78, 389] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_161311__672.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2261 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | InJulia | 1SHOT | true | false | 3 | 20231225_161359__465 | 0 | 0.0 | 48.1692 | 0 | [78, 364] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_161359__465.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2262 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | InJulia | 1SHOT | true | false | 3 | 20231226_230151__459 | 0 | 0.0 | 57.1522 | 0 | [78, 431] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__InJulia__1SHOT__20231226_230151__459.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2263 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | false | 3 | 20231225_020608__770 | 0 | 0.0 | 18.2034 | 0 | [117, 123] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_020608__770.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2264 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_020626__690 | 3 | 0.0 | 17.9365 | 2 | [117, 121] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_020626__690.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2265 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_161204__199 | 3 | 0.0 | 18.4852 | 2 | [117, 126] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_161204__199.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2266 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231225_161220__638 | 3 | 0.0 | 15.1574 | 2 | [117, 100] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_161220__638.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2267 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 3 | 20231226_230054__318 | 3 | 0.0 | 15.7049 | 2 | [117, 104] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231226_230054__318.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2268 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_020533__123 | 3 | 0.0 | 42.7078 | 2 | [197, 109] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_020533__123.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2269 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231225_020550__465 | 0 | 0.0 | 16.8175 | 0 | [197, 96] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_020550__465.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2270 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231225_161110__430 | 3 | 0.0 | 44.903 | 2 | [197, 132] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_161110__430.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2271 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 3 | 20231225_161146__709 | 0 | 0.0 | 35.6732 | 0 | [197, 244] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_161146__709.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2272 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 3 | 20231226_230038__139 | 3 | 0.0 | 38.8809 | 2 | [197, 94] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231226_230038__139.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2273 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_021155__534 | 3 | 0.0 | 45.5882 | 2 | [391, 280] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_021155__534.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2274 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_021232__522 | 3 | 0.0 | 36.9779 | 2 | [391, 216] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_021232__522.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2275 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231225_161735__667 | 3 | 0.0 | 17.7326 | 2 | [391, 71] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_161735__667.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2276 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 3 | 20231225_161848__450 | 0 | 0.0 | 72.5264 | 0 | [391, 477] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_161848__450.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2277 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 3 | 20231226_230358__479 | 3 | 0.0 | 37.5351 | 2 | [391, 221] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231226_230358__479.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2278 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_021039__279 | 3 | 0.0 | 49.6154 | 2 | [389, 310] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_021039__279.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2279 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_021110__797 | 3 | 0.0 | 30.7391 | 2 | [389, 169] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_021110__797.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2280 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_161653__128 | 3 | 0.0 | 83.9396 | 2 | [389, 556] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_161653__128.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2281 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 3 | 20231225_161718__765 | 3 | 0.0 | 24.7817 | 2 | [389, 126] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_161718__765.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2282 | Apple-MacBook-Pro-M1 | FloatWithUnits | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 3 | 20231226_230320__474 | 3 | 0.0 | 88.8242 | 2 | [389, 597] | 0.6.0 | 2 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/FloatWithUnits/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231226_230320__474.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2283 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-instruct | InJulia | 1SHOT | false | false | 5 | 20231214_003153__443 | 0 | 0.0 | 11.1914 | 0 | [79, 332] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231214_003153__443.json | 0.0 | missing | missing | missing | |
| 2284 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_025911__613 | 0 | 0.0 | 6.00718 | 0 | [87, 100] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_025911__613.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2285 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_025925__987 | 0 | 0.0 | 13.8236 | 0 | [87, 248] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_025925__987.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2286 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231226_232707__988 | 2 | 0.0 | 9.90014 | 3 | [87, 175] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231226_232707__988.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2287 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_003142__573 | 0 | 0.0 | 4.8966 | 0 | [108, 131] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231214_003142__573.json | 50.0 | missing | missing | missing | |
| 2288 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_025902__391 | 0 | 0.0 | 2.80473 | 0 | [125, 34] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_025902__391.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2289 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_025905__648 | 3 | 0.0 | 3.11903 | 2 | [125, 40] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_025905__648.json | 81.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2290 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_232657__131 | 0 | 0.0 | 2.84971 | 0 | [125, 35] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231226_232657__131.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2291 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_003137__836 | 0 | 0.0 | 15.2471 | 0 | [184, 412] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231214_003137__836.json | 25.0 | missing | missing | missing | |
| 2292 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_025855__556 | 0 | 0.0 | 23.4761 | 0 | [202, 220] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_025855__556.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2293 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_025859__127 | 0 | 0.0 | 3.41157 | 0 | [202, 31] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_025859__127.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2294 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_232654__488 | 0 | 0.0 | 21.9048 | 0 | [202, 200] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231226_232654__488.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2295 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_003231__883 | 0 | 0.0 | 14.4842 | 0 | [11, 398] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231214_003231__883.json | 50.0 | missing | missing | missing | |
| 2296 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_030040__384 | 0 | 0.0 | 13.0418 | 0 | [390, 178] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_030040__384.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2297 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_030100__356 | 0 | 0.0 | 19.7673 | 0 | [390, 298] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_030100__356.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2298 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_232736__878 | 5 | 0.0 | 15.1889 | 3 | [390, 217] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231226_232736__878.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2299 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_003216__775 | 0 | 0.0 | 14.4319 | 0 | [379, 312] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231214_003216__775.json | 50.0 | missing | missing | missing | |
| 2300 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_030008__594 | 5 | 0.0 | 14.5009 | 3 | [387, 204] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_030008__594.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2301 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_030027__676 | 0 | 0.0 | 18.7219 | 0 | [387, 280] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_030027__676.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2302 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_232721__459 | 5 | 0.0 | 13.8281 | 3 | [387, 193] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231226_232721__459.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2303 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | InJulia | 1SHOT | true | false | 5 | 20231214_003303__220 | 0 | 0.0 | 13.963 | 0 | [79, 413] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__InJulia__1SHOT__20231214_003303__220.json | 25.0 | missing | missing | missing | |
| 2304 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_030135__931 | 0 | 0.0 | 4.12322 | 0 | [61, 69] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_030135__931.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2305 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_030139__702 | 0 | 0.0 | 3.9206 | 0 | [61, 65] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_030139__702.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2306 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_003249__958 | 0 | 0.0 | 5.44791 | 0 | [108, 148] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231214_003249__958.json | 25.0 | missing | missing | missing | |
| 2307 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_030121__760 | 0 | 0.0 | 4.43439 | 0 | [62, 75] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_030121__760.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2308 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_030131__181 | 0 | 0.0 | 10.715 | 0 | [62, 195] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_030131__181.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2309 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_003244__654 | 0 | 0.0 | 12.7306 | 0 | [184, 341] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231214_003244__654.json | 50.0 | missing | missing | missing | |
| 2310 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_030113__966 | 0 | 0.0 | 12.9126 | 0 | [77, 41] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_030113__966.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2311 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_030116__990 | 0 | 0.0 | 2.87617 | 0 | [77, 40] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_030116__990.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2312 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_003339__899 | 0 | 0.0 | 11.6174 | 0 | [11, 322] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231214_003339__899.json | 0.0 | missing | missing | missing | |
| 2313 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_030159__284 | 0 | 0.0 | 0.857751 | 0 | [79, 1] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_030159__284.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2314 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_030200__755 | 0 | 0.0 | 1.31922 | 0 | [79, 10] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_030200__755.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2315 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_003327__699 | 0 | 0.0 | 17.7037 | 0 | [379, 399] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231214_003327__699.json | 0.0 | missing | missing | missing | |
| 2316 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_030145__254 | 0 | 0.0 | 1.8931 | 0 | [76, 21] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_030145__254.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2317 | Apple-MacBook-Pro-M1 | clean_column | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_030158__553 | 0 | 0.0 | 13.1411 | 0 | [76, 236] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_030158__553.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2318 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_225943__734 | 0 | 0.0 | 11.6448 | 0 | [1, 363] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_225943__734.json | 25.0 | missing | missing | missing | |
| 2319 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_225952__962 | 0 | 0.0 | 9.13887 | 0 | [1, 289] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_225952__962.json | 25.0 | missing | missing | missing | |
| 2320 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_031420__693 | 5 | 0.0 | 43.0673 | 3 | [79, 257] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_031420__693.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2321 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_031457__433 | 5 | 0.0 | 37.0307 | 3 | [79, 219] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_031457__433.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2322 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_233403__341 | 4 | 0.0 | 23.7652 | 2 | [79, 136] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231226_233403__341.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2323 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_225908__983 | 0 | 0.0 | 10.3914 | 0 | [1, 322] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_225908__983.json | 25.0 | missing | missing | missing | |
| 2324 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_225915__613 | 0 | 0.0 | 7.62156 | 0 | [1, 240] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_225915__613.json | 25.0 | missing | missing | missing | |
| 2325 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_031323__460 | 0 | 0.0 | 32.6024 | 0 | [120, 185] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_031323__460.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2326 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_031337__777 | 0 | 0.0 | 13.9164 | 0 | [120, 69] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_031337__777.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2327 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_233339__198 | 0 | 0.0 | 44.582 | 0 | [120, 262] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_233339__198.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2328 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_225844__424 | 0 | 0.0 | 12.0772 | 0 | [1, 362] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_225844__424.json | 25.0 | missing | missing | missing | |
| 2329 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_225852__797 | 0 | 0.0 | 8.0521 | 0 | [1, 246] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_225852__797.json | 25.0 | missing | missing | missing | |
| 2330 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_031233__282 | 5 | 0.0 | 48.5371 | 3 | [196, 93] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_031233__282.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2331 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_031250__816 | 0 | 0.0 | 17.18 | 0 | [196, 75] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_031250__816.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2332 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_233255__961 | 0 | 0.0 | 49.8479 | 0 | [196, 135] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_233255__961.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2333 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_230142__408 | 0 | 0.0 | 16.7298 | 0 | [1, 464] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_230142__408.json | 25.0 | missing | missing | missing | |
| 2334 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_230159__191 | 0 | 0.0 | 17.1545 | 0 | [1, 475] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_230159__191.json | 25.0 | missing | missing | missing | |
| 2335 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_031754__470 | 4 | 0.0 | 38.3719 | 3 | [408, 174] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_031754__470.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2336 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_031825__421 | 5 | 0.0 | 30.825 | 3 | [408, 128] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_031825__421.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2337 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_233541__277 | 4 | 0.0 | 34.1599 | 2 | [408, 149] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_233541__277.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2338 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_230053__915 | 0 | 0.0 | 18.5775 | 0 | [1, 512] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_230053__915.json | 25.0 | missing | missing | missing | |
| 2339 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_230112__789 | 0 | 0.0 | 18.9801 | 0 | [1, 522] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_230112__789.json | 25.0 | missing | missing | missing | |
| 2340 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_031633__659 | 4 | 0.0 | 43.7354 | 3 | [406, 206] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_031633__659.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2341 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_031715__879 | 5 | 0.0 | 41.9202 | 3 | [406, 195] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_031715__879.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2342 | Apple-MacBook-Pro-M1 | clean_column | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_233507__292 | 4 | 0.0 | 63.0082 | 3 | [406, 323] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_233507__292.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2343 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_234106__535 | 0 | 0.0 | 7.5776 | 0 | [78, 291] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231226_234106__535.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2344 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_111332__629 | 0 | 0.0 | 5.44829 | 0 | [78, 208] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_111332__629.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2345 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_111337__325 | 0 | 0.0 | 5.59057 | 0 | [78, 214] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_111337__325.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2346 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_111342__901 | 0 | 0.0 | 5.30303 | 0 | [78, 203] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_111342__901.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2347 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_234058__264 | 0 | 0.0 | 7.24833 | 0 | [115, 273] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_234058__264.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2348 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_111318__647 | 0 | 0.0 | 1.44452 | 0 | [115, 45] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_111318__647.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2349 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_111325__965 | 0 | 0.0 | 6.15728 | 0 | [115, 231] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_111325__965.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2350 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_111326__166 | 0 | 0.0 | 1.4134 | 0 | [115, 44] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_111326__166.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2351 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_234051__624 | 0 | 0.0 | 5.40831 | 0 | [189, 58] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_234051__624.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2352 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_111308__973 | 0 | 0.0 | 5.47714 | 0 | [189, 67] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_111308__973.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2353 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_111312__770 | 0 | 0.0 | 4.20516 | 0 | [189, 145] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_111312__770.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2354 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_111317__392 | 0 | 0.0 | 4.92551 | 0 | [189, 172] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_111317__392.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2355 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_234120__945 | 0 | 0.0 | 3.68542 | 0 | [367, 95] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_234120__945.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2356 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_111406__271 | 0 | 0.0 | 3.52426 | 0 | [367, 89] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_111406__271.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2357 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_111413__711 | 0 | 0.0 | 6.17745 | 0 | [367, 188] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_111413__711.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2358 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_111422__359 | 0 | 0.0 | 9.24837 | 0 | [367, 301] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_111422__359.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2359 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_234116__939 | 0 | 0.0 | 10.0807 | 0 | [364, 331] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_234116__939.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2360 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_111350__239 | 0 | 0.0 | 7.41423 | 0 | [364, 234] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_111350__239.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2361 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_111354__406 | 0 | 0.0 | 4.15247 | 0 | [364, 113] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_111354__406.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2362 | Apple-MacBook-Pro-M1 | clean_column | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_111403__655 | 0 | 0.0 | 8.70999 | 0 | [364, 282] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_111403__655.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2363 | Apple-MacBook-Pro-M1 | clean_column | llama2 | InJulia | 1SHOT | false | false | 5 | 20231214_002656__193 | 0 | 0.0 | 13.9154 | 0 | [79, 412] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/llama2/evaluation__InJulia__1SHOT__20231214_002656__193.json | 0.0 | missing | missing | missing | |
| 2364 | Apple-MacBook-Pro-M1 | clean_column | llama2 | InJulia | 1SHOT | false | false | 5 | 20231225_024631__437 | 0 | 0.0 | 11.677 | 0 | [79, 347] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/llama2/evaluation__InJulia__1SHOT__20231225_024631__437.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2365 | Apple-MacBook-Pro-M1 | clean_column | llama2 | InJulia | 1SHOT | true | false | 5 | 20231225_024643__860 | 0 | 0.0 | 11.6512 | 0 | [1, 362] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/llama2/evaluation__InJulia__1SHOT__20231225_024643__860.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2366 | Apple-MacBook-Pro-M1 | clean_column | llama2 | InJulia | 1SHOT | true | true | 5 | 20231226_232107__868 | 0 | 0.0 | 13.0693 | 0 | [79, 393] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/llama2/evaluation__InJulia__1SHOT__20231226_232107__868.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2367 | Apple-MacBook-Pro-M1 | clean_column | llama2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_002642__295 | 0 | 0.0 | 6.9509 | 0 | [108, 195] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/llama2/evaluation__JuliaExpertAsk__1SHOT__20231214_002642__295.json | 0.0 | missing | missing | missing | |
| 2368 | Apple-MacBook-Pro-M1 | clean_column | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_024613__193 | 0 | 0.0 | 9.84125 | 0 | [108, 282] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_024613__193.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2369 | Apple-MacBook-Pro-M1 | clean_column | llama2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_024619__557 | 0 | 0.0 | 5.85733 | 0 | [1, 184] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_024619__557.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2370 | Apple-MacBook-Pro-M1 | clean_column | llama2 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231226_232054__497 | 0 | 0.0 | 5.00478 | 0 | [108, 137] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/llama2/evaluation__JuliaExpertAsk__1SHOT__20231226_232054__497.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2371 | Apple-MacBook-Pro-M1 | clean_column | llama2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_002635__478 | 0 | 0.0 | 14.9756 | 0 | [184, 404] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_002635__478.json | 25.0 | missing | missing | missing | |
| 2372 | Apple-MacBook-Pro-M1 | clean_column | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_024550__950 | 0 | 0.0 | 22.3612 | 0 | [202, 461] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_024550__950.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2373 | Apple-MacBook-Pro-M1 | clean_column | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_024603__767 | 0 | 0.0 | 13.4483 | 0 | [1, 399] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_024603__767.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2374 | Apple-MacBook-Pro-M1 | clean_column | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_232049__356 | 0 | 0.0 | 13.433 | 0 | [202, 223] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_232049__356.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2375 | Apple-MacBook-Pro-M1 | clean_column | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_002740__233 | 0 | 0.0 | 18.6891 | 0 | [11, 508] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_002740__233.json | 50.0 | missing | missing | missing | |
| 2376 | Apple-MacBook-Pro-M1 | clean_column | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_024742__666 | 0 | 0.0 | 10.0879 | 0 | [11, 281] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_024742__666.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2377 | Apple-MacBook-Pro-M1 | clean_column | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_024755__410 | 0 | 0.0 | 12.0584 | 0 | [1, 339] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_024755__410.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2378 | Apple-MacBook-Pro-M1 | clean_column | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_232135__705 | 0 | 0.0 | 12.4074 | 0 | [11, 349] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_232135__705.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2379 | Apple-MacBook-Pro-M1 | clean_column | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_002721__840 | 0 | 0.0 | 18.7336 | 0 | [379, 426] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/llama2/evaluation__JuliaRecapTask__1SHOT__20231214_002721__840.json | 0.0 | missing | missing | missing | |
| 2380 | Apple-MacBook-Pro-M1 | clean_column | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_024715__310 | 0 | 0.0 | 17.1737 | 0 | [379, 385] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_024715__310.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2381 | Apple-MacBook-Pro-M1 | clean_column | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_024732__898 | 0 | 0.0 | 17.3048 | 0 | [1, 477] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_024732__898.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2382 | Apple-MacBook-Pro-M1 | clean_column | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_232123__409 | 0 | 0.0 | 15.6868 | 0 | [379, 351] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/llama2/evaluation__JuliaRecapTask__1SHOT__20231226_232123__409.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2383 | Apple-MacBook-Pro-M1 | clean_column | magicoder | InJulia | 1SHOT | true | false | 5 | 20231214_003418__323 | 0 | 0.0 | 14.2008 | 0 | [79, 420] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder/evaluation__InJulia__1SHOT__20231214_003418__323.json | 25.0 | missing | missing | missing | |
| 2384 | Apple-MacBook-Pro-M1 | clean_column | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_030235__823 | 3 | 0.0 | 5.59543 | 2 | [79, 178] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder/evaluation__InJulia__1SHOT__20231225_030235__823.json | 81.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2385 | Apple-MacBook-Pro-M1 | clean_column | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_030241__339 | 5 | 0.0 | 5.70766 | 3 | [79, 182] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder/evaluation__InJulia__1SHOT__20231225_030241__339.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2386 | Apple-MacBook-Pro-M1 | clean_column | magicoder | InJulia | 1SHOT | true | true | 5 | 20231226_232758__810 | 4 | 0.0 | 3.51801 | 2 | [79, 107] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder/evaluation__InJulia__1SHOT__20231226_232758__810.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2387 | Apple-MacBook-Pro-M1 | clean_column | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_003404__362 | 0 | 0.0 | 5.89981 | 0 | [108, 162] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231214_003404__362.json | 50.0 | missing | missing | missing | |
| 2388 | Apple-MacBook-Pro-M1 | clean_column | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_030225__259 | 0 | 0.0 | 7.82215 | 0 | [118, 247] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_030225__259.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2389 | Apple-MacBook-Pro-M1 | clean_column | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_030230__239 | 5 | 0.0 | 4.30858 | 3 | [118, 129] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_030230__239.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2390 | Apple-MacBook-Pro-M1 | clean_column | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_232755__843 | 4 | 0.0 | 6.0798 | 2 | [118, 189] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231226_232755__843.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2391 | Apple-MacBook-Pro-M1 | clean_column | magicoder | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_003358__637 | 0 | 0.0 | 19.087 | 0 | [184, 518] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231214_003358__637.json | 25.0 | missing | missing | missing | |
| 2392 | Apple-MacBook-Pro-M1 | clean_column | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_030210__421 | 0 | 0.0 | 10.2515 | 0 | [194, 112] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_030210__421.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2393 | Apple-MacBook-Pro-M1 | clean_column | magicoder | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_030217__252 | 0 | 0.0 | 7.12968 | 0 | [194, 209] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_030217__252.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2394 | Apple-MacBook-Pro-M1 | clean_column | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_232748__345 | 0 | 0.0 | 12.2968 | 2 | [194, 189] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231226_232748__345.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2395 | Apple-MacBook-Pro-M1 | clean_column | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_003449__640 | 0 | 0.0 | 12.5499 | 0 | [11, 347] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231214_003449__640.json | 50.0 | missing | missing | missing | |
| 2396 | Apple-MacBook-Pro-M1 | clean_column | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_030313__585 | 0 | 0.0 | 8.611 | 0 | [382, 228] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_030313__585.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2397 | Apple-MacBook-Pro-M1 | clean_column | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_030322__600 | 5 | 0.0 | 8.50329 | 3 | [382, 225] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_030322__600.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2398 | Apple-MacBook-Pro-M1 | clean_column | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_232813__898 | 5 | 0.0 | 7.71641 | 3 | [382, 200] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231226_232813__898.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2399 | Apple-MacBook-Pro-M1 | clean_column | magicoder | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_003437__894 | 0 | 0.0 | 10.5505 | 0 | [379, 205] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder/evaluation__JuliaRecapTask__1SHOT__20231214_003437__894.json | 0.0 | missing | missing | missing | |
| 2400 | Apple-MacBook-Pro-M1 | clean_column | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_030258__162 | 5 | 0.0 | 8.82398 | 3 | [379, 234] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_030258__162.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2401 | Apple-MacBook-Pro-M1 | clean_column | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_030305__490 | 0 | 0.0 | 6.40602 | 0 | [379, 157] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_030305__490.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2402 | Apple-MacBook-Pro-M1 | clean_column | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_232805__639 | 5 | 0.0 | 7.22057 | 3 | [379, 184] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder/evaluation__JuliaRecapTask__1SHOT__20231226_232805__639.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2403 | Apple-MacBook-Pro-M1 | clean_column | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_180147__266 | 0 | 0.0 | 20.6072 | 0 | [79, 401] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_180147__266.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2404 | Apple-MacBook-Pro-M1 | clean_column | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_180150__791 | 0 | 0.0 | 3.1911 | 0 | [79, 54] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_180150__791.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2405 | Apple-MacBook-Pro-M1 | clean_column | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_180158__589 | 5 | 0.0 | 7.64367 | 3 | [79, 144] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_180158__589.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2406 | Apple-MacBook-Pro-M1 | clean_column | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_180117__530 | 0 | 0.0 | 4.65145 | 0 | [118, 80] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_180117__530.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2407 | Apple-MacBook-Pro-M1 | clean_column | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_180120__700 | 5 | 0.0 | 3.21323 | 3 | [118, 51] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_180120__700.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2408 | Apple-MacBook-Pro-M1 | clean_column | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_180126__709 | 0 | 0.0 | 6.18565 | 2 | [118, 111] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_180126__709.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2409 | Apple-MacBook-Pro-M1 | clean_column | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_180100__251 | 4 | 0.0 | 7.70154 | 3 | [194, 131] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_180100__251.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2410 | Apple-MacBook-Pro-M1 | clean_column | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_180109__250 | 5 | 0.0 | 9.56283 | 3 | [194, 168] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_180109__250.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2411 | Apple-MacBook-Pro-M1 | clean_column | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_180112__476 | 0 | 0.0 | 2.94795 | 0 | [194, 36] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_180112__476.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2412 | Apple-MacBook-Pro-M1 | clean_column | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_180241__343 | 3 | 0.0 | 14.337 | 2 | [382, 241] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_180241__343.json | 81.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2413 | Apple-MacBook-Pro-M1 | clean_column | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_180251__784 | 4 | 0.0 | 9.53273 | 2 | [382, 148] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_180251__784.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2414 | Apple-MacBook-Pro-M1 | clean_column | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_180258__760 | 5 | 0.0 | 7.23119 | 3 | [382, 103] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_180258__760.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2415 | Apple-MacBook-Pro-M1 | clean_column | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_180208__240 | 0 | 0.0 | 10.1571 | 0 | [379, 160] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_180208__240.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2416 | Apple-MacBook-Pro-M1 | clean_column | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_180219__847 | 3 | 0.0 | 10.7171 | 2 | [379, 171] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_180219__847.json | 81.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2417 | Apple-MacBook-Pro-M1 | clean_column | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_180227__149 | 5 | 0.0 | 7.99893 | 3 | [379, 118] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_180227__149.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2418 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_230638__597 | 0 | 0.0 | 13.2083 | 0 | [79, 393] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_230638__597.json | 25.0 | missing | missing | missing | |
| 2419 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_230644__531 | 0 | 0.0 | 5.69774 | 0 | [1, 183] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_230644__531.json | 25.0 | missing | missing | missing | |
| 2420 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_230656__809 | 0 | 0.0 | 11.964 | 0 | [1, 373] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_230656__809.json | 0.0 | missing | missing | missing | |
| 2421 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_233723__567 | 0 | 0.0 | 2.74645 | 0 | [75, 59] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231226_233723__567.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2422 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_230615__680 | 0 | 0.0 | 9.26727 | 0 | [108, 266] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_230615__680.json | 25.0 | missing | missing | missing | |
| 2423 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_230620__238 | 0 | 0.0 | 4.95988 | 0 | [1, 158] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_230620__238.json | 0.0 | missing | missing | missing | |
| 2424 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_230625__199 | 0 | 0.0 | 4.96674 | 0 | [1, 158] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_230625__199.json | 25.0 | missing | missing | missing | |
| 2425 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_233720__557 | 0 | 0.0 | 8.8399 | 0 | [116, 213] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_233720__557.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2426 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_230540__998 | 0 | 0.0 | 14.5892 | 0 | [184, 395] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_230540__998.json | 25.0 | missing | missing | missing | |
| 2427 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_230556__172 | 0 | 0.0 | 15.6004 | 0 | [1, 460] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_230556__172.json | 25.0 | missing | missing | missing | |
| 2428 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_230605__994 | 0 | 0.0 | 9.68826 | 0 | [1, 294] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_230605__994.json | 0.0 | missing | missing | missing | |
| 2429 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_233711__874 | 0 | 0.0 | 9.21638 | 0 | [192, 75] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_233711__874.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2430 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_230818__847 | 0 | 0.0 | 20.0098 | 0 | [11, 542] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_230818__847.json | 25.0 | missing | missing | missing | |
| 2431 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_230834__676 | 0 | 0.0 | 15.8856 | 0 | [1, 442] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_230834__676.json | 25.0 | missing | missing | missing | |
| 2432 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_230851__713 | 0 | 0.0 | 16.5733 | 0 | [1, 460] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_230851__713.json | 25.0 | missing | missing | missing | |
| 2433 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_233756__166 | 0 | 0.0 | 18.8073 | 0 | [383, 421] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_233756__166.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2434 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_230730__644 | 0 | 0.0 | 17.5038 | 0 | [379, 395] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_230730__644.json | 25.0 | missing | missing | missing | |
| 2435 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_230747__478 | 0 | 0.0 | 17.1873 | 0 | [1, 476] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_230747__478.json | 25.0 | missing | missing | missing | |
| 2436 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_230758__568 | 0 | 0.0 | 11.0129 | 0 | [1, 313] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_230758__568.json | 25.0 | missing | missing | missing | |
| 2437 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231226_233737__246 | 0 | 0.0 | 14.5761 | 0 | [381, 317] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_233737__246.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2438 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_231840__227 | 5 | 0.0 | 7.03798 | 3 | [74, 219] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_231840__227.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2439 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | false | 5 | 20231227_231847__484 | 0 | 0.0 | 7.16018 | 0 | [74, 223] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_231847__484.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2440 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_231854__195 | 0 | 0.0 | 6.88232 | 3 | [74, 214] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_231854__195.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2441 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_231903__887 | 0 | 0.0 | 8.85004 | 0 | [74, 278] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_231903__887.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2442 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_231914__776 | 0 | 0.0 | 11.4097 | 0 | [74, 361] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_231914__776.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2443 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_231825__828 | 5 | 0.0 | 2.13424 | 3 | [115, 51] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_231825__828.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2444 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_231827__426 | 5 | 0.0 | 1.93383 | 3 | [115, 45] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_231827__426.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2445 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_231829__509 | 5 | 0.0 | 1.95376 | 3 | [115, 45] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_231829__509.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2446 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_231831__451 | 5 | 0.0 | 2.02685 | 3 | [115, 48] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_231831__451.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2447 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_231833__724 | 0 | 0.0 | 1.90211 | 0 | [115, 44] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_231833__724.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2448 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_231750__982 | 0 | 0.0 | 12.2811 | 0 | [191, 344] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231750__982.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2449 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_231759__170 | 0 | 0.0 | 8.89832 | 3 | [191, 262] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231759__170.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2450 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_231806__355 | 5 | 0.0 | 7.66764 | 3 | [191, 222] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231806__355.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2451 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_231810__852 | 5 | 0.0 | 3.72964 | 3 | [191, 94] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231810__852.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2452 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_231823__218 | 0 | 0.0 | 12.3573 | 3 | [191, 372] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_231823__218.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2453 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_232017__175 | 0 | 0.0 | 12.7513 | 0 | [382, 348] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_232017__175.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2454 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_232030__877 | 5 | 0.0 | 12.9383 | 3 | [382, 353] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_232030__877.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2455 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_232041__473 | 4 | 0.0 | 10.1345 | 3 | [382, 267] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_232041__473.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2456 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_232051__464 | 5 | 0.0 | 10.0372 | 3 | [382, 264] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_232051__464.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2457 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_232102__110 | 0 | 0.0 | 10.9077 | 0 | [382, 291] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_232102__110.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2458 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_231925__779 | 0 | 0.0 | 10.2035 | 0 | [380, 269] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_231925__779.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2459 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_231935__600 | 5 | 0.0 | 10.1884 | 3 | [380, 269] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_231935__600.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2460 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_231946__281 | 5 | 0.0 | 11.3165 | 3 | [380, 304] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_231946__281.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2461 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_231954__678 | 5 | 0.0 | 7.16017 | 3 | [380, 174] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_231954__678.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2462 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_232005__316 | 5 | 0.0 | 10.8838 | 3 | [380, 291] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_232005__316.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2463 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231227_232215__524 | 0 | 0.0 | 8.23284 | 0 | [74, 202] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_232215__524.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2464 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_232226__809 | 5 | 0.0 | 11.6467 | 3 | [74, 290] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_232226__809.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2465 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_232235__409 | 0 | 0.0 | 8.97181 | 0 | [74, 221] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_232235__409.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2466 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_232246__219 | 0 | 0.0 | 10.0905 | 0 | [74, 250] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_232246__219.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2467 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_232255__319 | 5 | 0.0 | 8.93829 | 3 | [74, 220] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_232255__319.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2468 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_232146__348 | 5 | 0.0 | 3.02144 | 3 | [115, 61] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_232146__348.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2469 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_232152__990 | 5 | 0.0 | 5.62216 | 3 | [115, 129] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_232152__990.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2470 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_232157__319 | 0 | 0.0 | 5.5898 | 0 | [115, 128] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_232157__319.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2471 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_232201__366 | 5 | 0.0 | 3.47311 | 3 | [115, 73] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_232201__366.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2472 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_232206__232 | 0 | 0.0 | 5.6987 | 0 | [115, 131] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_232206__232.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2473 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_232108__428 | 4 | 0.0 | 6.00559 | 2 | [191, 108] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_232108__428.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2474 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_232112__949 | 2 | 0.0 | 4.12843 | 3 | [191, 81] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_232112__949.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2475 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_232122__130 | 0 | 0.0 | 10.4541 | 0 | [191, 243] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_232122__130.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2476 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_232130__682 | 0 | 0.0 | 7.52332 | 0 | [191, 168] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_232130__682.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2477 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_232143__539 | 0 | 0.0 | 12.7672 | 0 | [191, 302] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_232143__539.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2478 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_232412__869 | 0 | 0.0 | 15.281 | 0 | [382, 333] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_232412__869.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2479 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_232427__824 | 0 | 0.0 | 14.9574 | 3 | [382, 325] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_232427__824.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2480 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_232442__809 | 0 | 0.0 | 14.6017 | 3 | [382, 316] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_232442__809.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2481 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_232455__192 | 0 | 0.0 | 13.3863 | 0 | [382, 286] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_232455__192.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2482 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_232511__446 | 2 | 0.0 | 15.7356 | 3 | [382, 344] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_232511__446.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2483 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_232303__851 | 0 | 0.0 | 8.67932 | 0 | [380, 169] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_232303__851.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2484 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_232317__409 | 0 | 0.0 | 13.453 | 0 | [380, 288] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_232317__409.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2485 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_232332__399 | 0 | 0.0 | 15.1223 | 0 | [380, 329] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_232332__399.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2486 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_232348__329 | 0 | 0.0 | 15.7704 | 0 | [380, 345] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_232348__329.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2487 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_232357__222 | 5 | 0.0 | 9.0444 | 3 | [380, 178] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_232357__222.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2488 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_121236__331 | 0 | 0.0 | 12.4128 | 0 | [74, 223] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_121236__331.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2489 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_121250__352 | 5 | 0.0 | 13.9767 | 3 | [74, 251] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_121250__352.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2490 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_234006__823 | 5 | 0.0 | 10.0083 | 3 | [74, 180] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_234006__823.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2491 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_121220__353 | 0 | 0.0 | 3.30762 | 0 | [115, 48] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_121220__353.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2492 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_121224__747 | 5 | 0.0 | 3.38196 | 3 | [115, 50] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_121224__747.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2493 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_233956__467 | 5 | 0.0 | 3.13946 | 3 | [115, 46] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_233956__467.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2494 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_121211__300 | 0 | 0.0 | 15.1551 | 0 | [191, 261] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_121211__300.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2495 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_121217__570 | 5 | 0.0 | 5.71459 | 3 | [191, 83] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_121217__570.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2496 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_233953__846 | 0 | 0.0 | 18.9278 | 0 | [191, 168] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_233953__846.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2497 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_121417__712 | 5 | 0.0 | 21.4943 | 3 | [382, 353] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_121417__712.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2498 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_121438__409 | 4 | 0.0 | 20.8479 | 3 | [382, 346] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_121438__409.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2499 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_234046__624 | 5 | 0.0 | 19.7953 | 3 | [382, 326] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_234046__624.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2500 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_121340__510 | 5 | 0.0 | 18.3095 | 3 | [380, 296] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_121340__510.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2501 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_121355__819 | 5 | 0.0 | 14.8983 | 3 | [380, 230] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_121355__819.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2502 | Apple-MacBook-Pro-M1 | clean_column | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_234026__169 | 5 | 0.0 | 19.6768 | 3 | [380, 324] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_234026__169.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2503 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_111808__914 | 3 | 0.0 | 26.2935 | 2 | [78, 149] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_111808__914.json | 81.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2504 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_111839__784 | 5 | 0.0 | 30.8272 | 3 | [78, 177] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_111839__784.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2505 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_111904__485 | 4 | 0.0 | 24.9915 | 3 | [78, 141] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_111904__485.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2506 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_145555__279 | 0 | 0.0 | 26.8863 | 0 | [78, 152] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_145555__279.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2507 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_145617__434 | 4 | 0.0 | 21.8552 | 2 | [78, 121] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_145617__434.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2508 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_111712__195 | 5 | 0.0 | 42.2076 | 3 | [117, 241] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_111712__195.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2509 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_111733__554 | 0 | 0.0 | 21.4569 | 0 | [117, 114] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_111733__554.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2510 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_111742__354 | 0 | 0.0 | 8.8638 | 0 | [117, 36] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_111742__354.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2511 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_145441__609 | 0 | 0.0 | 35.4635 | 3 | [117, 199] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_145441__609.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2512 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_145528__981 | 5 | 0.0 | 47.6715 | 3 | [117, 273] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_145528__981.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2513 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_111502__890 | 5 | 0.0 | 39.9599 | 3 | [192, 191] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_111502__890.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2514 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_111546__606 | 4 | 0.0 | 44.202 | 2 | [192, 242] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_111546__606.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2515 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_111629__602 | 0 | 0.0 | 43.0337 | 0 | [192, 235] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_111629__602.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2516 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_145329__164 | 4 | 0.0 | 52.393 | 3 | [192, 290] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_145329__164.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2517 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_145405__830 | 0 | 0.0 | 35.9153 | 0 | [192, 191] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_145405__830.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2518 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_112219__355 | 3 | 0.0 | 38.5577 | 3 | [391, 170] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_112219__355.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2519 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_112319__830 | 5 | 0.0 | 60.1815 | 3 | [391, 298] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_112319__830.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2520 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_112330__366 | 0 | 0.0 | 10.9227 | 0 | [391, 4] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_112330__366.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2521 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_145819__730 | 0 | 0.0 | 11.0575 | 0 | [391, 4] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_145819__730.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2522 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_145830__312 | 0 | 0.0 | 11.2011 | 0 | [391, 5] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_145830__312.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2523 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_111948__381 | 1 | 0.0 | 44.0952 | 3 | [389, 203] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_111948__381.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2524 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_112048__184 | 5 | 0.0 | 59.6211 | 3 | [389, 295] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_112048__184.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2525 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_112140__250 | 5 | 0.0 | 52.1442 | 3 | [389, 251] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_112140__250.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2526 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_145710__196 | 5 | 0.0 | 52.2777 | 3 | [389, 250] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_145710__196.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2527 | Apple-MacBook-Pro-M1 | clean_column | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_145808__426 | 0 | 0.0 | 58.4041 | 3 | [389, 286] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_145808__426.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2528 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_231006__469 | 0 | 0.0 | 11.1476 | 0 | [79, 332] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_231006__469.json | 0.0 | missing | missing | missing | |
| 2529 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_231018__222 | 0 | 0.0 | 11.5963 | 0 | [1, 362] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_231018__222.json | 25.0 | missing | missing | missing | |
| 2530 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_231032__436 | 0 | 0.0 | 14.2971 | 0 | [1, 440] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_231032__436.json | 25.0 | missing | missing | missing | |
| 2531 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_233821__432 | 4 | 0.0 | 7.89372 | 2 | [83, 193] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231226_233821__432.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2532 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_230940__699 | 0 | 0.0 | 5.81375 | 0 | [108, 161] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_230940__699.json | 0.0 | missing | missing | missing | |
| 2533 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_230949__170 | 0 | 0.0 | 8.27826 | 0 | [1, 259] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_230949__170.json | 25.0 | missing | missing | missing | |
| 2534 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_230955__210 | 0 | 0.0 | 6.21679 | 0 | [1, 197] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_230955__210.json | 25.0 | missing | missing | missing | |
| 2535 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_233813__965 | 0 | 0.0 | 3.24242 | 0 | [124, 67] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_233813__965.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2536 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_230906__477 | 0 | 0.0 | 15.779 | 0 | [184, 428] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_230906__477.json | 25.0 | missing | missing | missing | |
| 2537 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_230925__665 | 0 | 0.0 | 18.1869 | 0 | [1, 530] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_230925__665.json | 25.0 | missing | missing | missing | |
| 2538 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_230934__714 | 0 | 0.0 | 9.86972 | 0 | [1, 299] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_230934__714.json | 25.0 | missing | missing | missing | |
| 2539 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_233810__931 | 0 | 0.0 | 13.6231 | 0 | [200, 169] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_233810__931.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2540 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_231158__313 | 0 | 0.0 | 20.711 | 0 | [11, 560] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_231158__313.json | 25.0 | missing | missing | missing | |
| 2541 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_231207__356 | 0 | 0.0 | 9.01409 | 0 | [1, 258] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_231207__356.json | 25.0 | missing | missing | missing | |
| 2542 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_231220__190 | 0 | 0.0 | 12.775 | 0 | [1, 360] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_231220__190.json | 0.0 | missing | missing | missing | |
| 2543 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_233846__194 | 0 | 0.0 | 11.8528 | 0 | [391, 245] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_233846__194.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2544 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_231105__542 | 0 | 0.0 | 15.5912 | 0 | [379, 344] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_231105__542.json | 25.0 | missing | missing | missing | |
| 2545 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_231124__943 | 0 | 0.0 | 18.7382 | 0 | [1, 516] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_231124__943.json | 25.0 | missing | missing | missing | |
| 2546 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_231138__356 | 0 | 0.0 | 13.7783 | 0 | [1, 387] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_231138__356.json | 25.0 | missing | missing | missing | |
| 2547 | Apple-MacBook-Pro-M1 | clean_column | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_233834__987 | 0 | 0.0 | 12.6591 | 0 | [389, 265] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_233834__987.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2548 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231214_002811__939 | 0 | 0.0 | 10.5197 | 0 | [79, 312] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231214_002811__939.json | 50.0 | missing | missing | missing | |
| 2549 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | InJulia | 1SHOT | false | false | 5 | 20231225_024825__632 | 0 | 0.0 | 1.84122 | 0 | [81, 47] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_024825__632.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2550 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_024831__990 | 0 | 0.0 | 5.55964 | 0 | [81, 173] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_024831__990.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2551 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231226_232153__280 | 5 | 0.0 | 2.36119 | 3 | [81, 65] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231226_232153__280.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2552 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_002800__647 | 0 | 0.0 | 5.63532 | 0 | [108, 154] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231214_002800__647.json | 50.0 | missing | missing | missing | |
| 2553 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_024820__739 | 0 | 0.0 | 5.76361 | 0 | [122, 174] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_024820__739.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2554 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_024823__466 | 0 | 0.0 | 3.45226 | 0 | [122, 97] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_024823__466.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2555 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_232151__213 | 0 | 0.0 | 2.51841 | 0 | [122, 65] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231226_232151__213.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2556 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_002754__199 | 0 | 0.0 | 14.3729 | 0 | [184, 387] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231214_002754__199.json | 25.0 | missing | missing | missing | |
| 2557 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_024807__475 | 5 | 0.0 | 11.9194 | 3 | [198, 184] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_024807__475.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2558 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_024814__140 | 0 | 0.0 | 7.2822 | 0 | [198, 209] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_024814__140.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2559 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_232148__343 | 4 | 0.0 | 13.1667 | 2 | [198, 239] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231226_232148__343.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2560 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_002854__413 | 0 | 0.0 | 17.6585 | 0 | [11, 482] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231214_002854__413.json | 50.0 | missing | missing | missing | |
| 2561 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_024915__243 | 0 | 0.0 | 7.48489 | 0 | [389, 182] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_024915__243.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2562 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_024922__291 | 5 | 0.0 | 7.65594 | 3 | [389, 188] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_024922__291.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2563 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_232207__958 | 0 | 0.0 | 3.99228 | 0 | [389, 70] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231226_232207__958.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2564 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_002837__745 | 0 | 0.0 | 19.7793 | 0 | [379, 453] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231214_002837__745.json | 50.0 | missing | missing | missing | |
| 2565 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_024852__190 | 5 | 0.0 | 9.98389 | 3 | [387, 262] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_024852__190.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2566 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_024907__166 | 0 | 0.0 | 14.6256 | 3 | [387, 406] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_024907__166.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2567 | Apple-MacBook-Pro-M1 | clean_column | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_232203__523 | 5 | 0.0 | 9.03081 | 3 | [387, 232] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231226_232203__523.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2568 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | InJulia | 1SHOT | true | false | 5 | 20231214_003637__787 | 0 | 0.0 | 13.598 | 0 | [79, 403] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__InJulia__1SHOT__20231214_003637__787.json | 25.0 | missing | missing | missing | |
| 2569 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231225_030513__271 | 0 | 0.0 | 8.10921 | 0 | [82, 139] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__InJulia__1SHOT__20231225_030513__271.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2570 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_030519__543 | 0 | 0.0 | 5.92723 | 0 | [82, 98] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__InJulia__1SHOT__20231225_030519__543.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2571 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231226_232909__978 | 0 | 0.0 | 8.1473 | 0 | [82, 141] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__InJulia__1SHOT__20231226_232909__978.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2572 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_003623__658 | 0 | 0.0 | 6.38548 | 0 | [108, 177] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231214_003623__658.json | 50.0 | missing | missing | missing | |
| 2573 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_030500__372 | 0 | 0.0 | 5.10159 | 0 | [121, 78] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_030500__372.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2574 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_030504__294 | 0 | 0.0 | 4.8068 | 0 | [121, 72] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_030504__294.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2575 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_232900__598 | 0 | 0.0 | 4.65808 | 0 | [121, 69] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231226_232900__598.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2576 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_003617__309 | 0 | 0.0 | 9.70911 | 0 | [184, 254] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231214_003617__309.json | 50.0 | missing | missing | missing | |
| 2577 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_030439__342 | 0 | 0.0 | 24.681 | 0 | [197, 249] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_030439__342.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2578 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_030454__763 | 0 | 0.0 | 15.1604 | 0 | [197, 250] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_030454__763.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2579 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_232856__632 | 0 | 0.0 | 23.656 | 0 | [197, 243] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231226_232856__632.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2580 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_003714__301 | 0 | 0.0 | 11.3702 | 0 | [11, 316] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231214_003714__301.json | 50.0 | missing | missing | missing | |
| 2581 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_030642__307 | 0 | 0.0 | 7.67872 | 0 | [385, 80] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_030642__307.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2582 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_030702__452 | 0 | 0.0 | 19.8975 | 0 | [385, 299] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_030702__452.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2583 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_232952__454 | 0 | 0.0 | 25.4567 | 0 | [385, 398] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231226_232952__454.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2584 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_003702__811 | 0 | 0.0 | 18.4149 | 0 | [379, 418] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231214_003702__811.json | 50.0 | missing | missing | missing | |
| 2585 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_030614__629 | 0 | 0.0 | 38.1891 | 0 | [382, 617] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_030614__629.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2586 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_030634__451 | 0 | 0.0 | 19.8591 | 0 | [382, 303] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_030634__451.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2587 | Apple-MacBook-Pro-M1 | clean_column | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_232927__155 | 0 | 0.0 | 18.0361 | 0 | [382, 272] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231226_232927__155.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2588 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | true | false | 5 | 20231219_231330__537 | 0 | 0.0 | 10.3275 | 0 | [79, 308] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_231330__537.json | 25.0 | missing | missing | missing | |
| 2589 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | true | false | 5 | 20231219_231341__964 | 0 | 0.0 | 10.9216 | 0 | [1, 342] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_231341__964.json | 25.0 | missing | missing | missing | |
| 2590 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | true | false | 5 | 20231219_231352__567 | 0 | 0.0 | 11.0346 | 0 | [1, 345] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_231352__567.json | 25.0 | missing | missing | missing | |
| 2591 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231226_233923__720 | 0 | 0.0 | 13.2979 | 0 | [72, 508] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231226_233923__720.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2592 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_231306__356 | 0 | 0.0 | 5.60134 | 0 | [108, 154] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_231306__356.json | 25.0 | missing | missing | missing | |
| 2593 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_231312__182 | 0 | 0.0 | 5.79347 | 0 | [1, 184] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_231312__182.json | 25.0 | missing | missing | missing | |
| 2594 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_231319__169 | 0 | 0.0 | 7.76262 | 0 | [1, 244] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_231319__169.json | 25.0 | missing | missing | missing | |
| 2595 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_233909__939 | 0 | 0.0 | 18.1033 | 0 | [109, 674] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_233909__939.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2596 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_231235__422 | 0 | 0.0 | 14.5896 | 0 | [184, 395] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_231235__422.json | 25.0 | missing | missing | missing | |
| 2597 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_231246__574 | 0 | 0.0 | 11.0488 | 0 | [1, 333] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_231246__574.json | 0.0 | missing | missing | missing | |
| 2598 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_231300__566 | 0 | 0.0 | 14.3269 | 0 | [1, 425] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_231300__566.json | 25.0 | missing | missing | missing | |
| 2599 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_233851__651 | 0 | 0.0 | 5.23812 | 0 | [183, 49] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_233851__651.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2600 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_231522__349 | 0 | 0.0 | 23.6075 | 0 | [11, 632] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_231522__349.json | 25.0 | missing | missing | missing | |
| 2601 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_231542__423 | 0 | 0.0 | 19.2379 | 0 | [1, 528] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_231542__423.json | 0.0 | missing | missing | missing | |
| 2602 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_231553__595 | 0 | 0.0 | 11.4661 | 0 | [1, 325] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_231553__595.json | 25.0 | missing | missing | missing | |
| 2603 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_233934__924 | 0 | 0.0 | 6.29127 | 0 | [361, 193] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_233934__924.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2604 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_231426__291 | 0 | 0.0 | 15.8491 | 0 | [379, 351] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_231426__291.json | 0.0 | missing | missing | missing | |
| 2605 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_231443__121 | 0 | 0.0 | 16.9624 | 0 | [1, 470] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_231443__121.json | 25.0 | missing | missing | missing | |
| 2606 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_231459__487 | 0 | 0.0 | 15.5666 | 0 | [1, 434] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_231459__487.json | 25.0 | missing | missing | missing | |
| 2607 | Apple-MacBook-Pro-M1 | clean_column | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_233927__409 | 0 | 0.0 | 4.92657 | 0 | [358, 142] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_233927__409.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2608 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | InJulia | 1SHOT | false | false | 5 | 20231214_003741__348 | 0 | 0.0 | 9.23832 | 0 | [79, 274] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231214_003741__348.json | 0.0 | missing | missing | missing | |
| 2609 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_030900__131 | 4 | 0.0 | 25.6567 | 3 | [90, 193] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_030900__131.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2610 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_030926__722 | 3 | 0.0 | 25.1571 | 2 | [90, 189] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_030926__722.json | 81.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2611 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231226_233110__685 | 3 | 0.0 | 20.9688 | 2 | [90, 155] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231226_233110__685.json | 81.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2612 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_003732__450 | 0 | 0.0 | 7.68799 | 0 | [108, 217] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231214_003732__450.json | 0.0 | missing | missing | missing | |
| 2613 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_030811__967 | 4 | 0.0 | 11.2634 | 2 | [129, 66] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_030811__967.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2614 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_030835__854 | 3 | 0.0 | 23.9729 | 2 | [129, 169] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_030835__854.json | 81.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2615 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_233049__599 | 4 | 0.0 | 14.0142 | 3 | [129, 89] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231226_233049__599.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2616 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_003724__331 | 0 | 0.0 | 10.2001 | 0 | [184, 268] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_003724__331.json | 0.0 | missing | missing | missing | |
| 2617 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_030737__820 | 0 | 0.0 | 35.1258 | 0 | [205, 65] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_030737__820.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2618 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_030759__107 | 3 | 0.0 | 22.4025 | 2 | [205, 145] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_030759__107.json | 81.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2619 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_233035__210 | 4 | 0.0 | 42.9161 | 2 | [205, 142] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_233035__210.json | 86.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2620 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_003821__681 | 0 | 0.0 | 18.4429 | 0 | [11, 501] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_003821__681.json | 50.0 | missing | missing | missing | |
| 2621 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_031118__815 | 4 | 0.0 | 28.1766 | 3 | [393, 156] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_031118__815.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2622 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_031144__173 | 5 | 0.0 | 26.0046 | 3 | [393, 139] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_031144__173.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2623 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_233205__706 | 4 | 0.0 | 34.5579 | 3 | [393, 207] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_233205__706.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2624 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_003802__180 | 0 | 0.0 | 15.9389 | 0 | [379, 352] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231214_003802__180.json | 25.0 | missing | missing | missing | |
| 2625 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_031033__841 | 4 | 0.0 | 36.4428 | 3 | [390, 222] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_031033__841.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2626 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_031050__671 | 5 | 0.0 | 16.7406 | 3 | [390, 67] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_031050__671.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2627 | Apple-MacBook-Pro-M1 | clean_column | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_233130__191 | 5 | 0.0 | 19.9364 | 3 | [390, 93] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231226_233130__191.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2628 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_230300__278 | 0 | 0.0 | 13.1957 | 0 | [79, 393] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_230300__278.json | 25.0 | missing | missing | missing | |
| 2629 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_230309__551 | 0 | 0.0 | 8.84277 | 0 | [1, 280] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_230309__551.json | 0.0 | missing | missing | missing | |
| 2630 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_230314__755 | 0 | 0.0 | 4.9831 | 0 | [1, 161] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_230314__755.json | 25.0 | missing | missing | missing | |
| 2631 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231226_233630__536 | 0 | 0.0 | 12.1326 | 0 | [83, 201] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231226_233630__536.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2632 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_230236__890 | 0 | 0.0 | 6.94868 | 0 | [108, 196] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_230236__890.json | 25.0 | missing | missing | missing | |
| 2633 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_230242__723 | 0 | 0.0 | 5.15585 | 0 | [1, 164] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_230242__723.json | 25.0 | missing | missing | missing | |
| 2634 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_230247__387 | 0 | 0.0 | 5.50709 | 0 | [1, 175] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_230247__387.json | 25.0 | missing | missing | missing | |
| 2635 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_233618__651 | 0 | 0.0 | 11.6866 | 0 | [124, 188] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231226_233618__651.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2636 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_230207__354 | 0 | 0.0 | 7.99854 | 0 | [184, 205] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_230207__354.json | 0.0 | missing | missing | missing | |
| 2637 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_230219__564 | 0 | 0.0 | 11.7534 | 0 | [1, 353] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_230219__564.json | 0.0 | missing | missing | missing | |
| 2638 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_230229__895 | 0 | 0.0 | 10.7059 | 0 | [1, 323] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_230229__895.json | 25.0 | missing | missing | missing | |
| 2639 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_031849__866 | 0 | 0.0 | 23.2862 | 0 | [200, 214] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_031849__866.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2640 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_233606__412 | 0 | 0.0 | 25.2807 | 0 | [200, 257] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231226_233606__412.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2641 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_230450__188 | 0 | 0.0 | 22.8973 | 0 | [11, 615] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_230450__188.json | 25.0 | missing | missing | missing | |
| 2642 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_230504__471 | 0 | 0.0 | 14.84 | 0 | [1, 415] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_230504__471.json | 25.0 | missing | missing | missing | |
| 2643 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_230525__192 | 0 | 0.0 | 20.9784 | 0 | [1, 572] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_230525__192.json | 25.0 | missing | missing | missing | |
| 2644 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_233702__178 | 0 | 0.0 | 14.3674 | 0 | [391, 191] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231226_233702__178.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2645 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_230357__500 | 0 | 0.0 | 19.0357 | 0 | [379, 435] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_230357__500.json | 25.0 | missing | missing | missing | |
| 2646 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_230416__994 | 0 | 0.0 | 19.4844 | 0 | [1, 535] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_230416__994.json | 25.0 | missing | missing | missing | |
| 2647 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_230427__748 | 0 | 0.0 | 10.6106 | 0 | [1, 302] | 0.5.0-DEV | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_230427__748.json | 25.0 | missing | missing | missing | |
| 2648 | Apple-MacBook-Pro-M1 | clean_column | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_233647__382 | 0 | 0.0 | 16.722 | 0 | [389, 231] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231226_233647__382.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2649 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231214_003527__690 | 0 | 0.0 | 12.4482 | 0 | [79, 370] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__InJulia__1SHOT__20231214_003527__690.json | 0.0 | missing | missing | missing | |
| 2650 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | InJulia | 1SHOT | true | false | 5 | 20231225_030342__432 | 0 | 0.0 | 4.32645 | 0 | [83, 245] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_030342__432.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2651 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | InJulia | 1SHOT | true | true | 5 | 20231225_030345__982 | 0 | 0.0 | 3.56945 | 0 | [83, 201] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_030345__982.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2652 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231226_232824__354 | 0 | 0.0 | 2.95855 | 0 | [83, 165] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__InJulia__1SHOT__20231226_232824__354.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2653 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_003515__389 | 0 | 0.0 | 8.4558 | 0 | [108, 240] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231214_003515__389.json | 25.0 | missing | missing | missing | |
| 2654 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_030335__599 | 0 | 0.0 | 1.35695 | 0 | [120, 65] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_030335__599.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2655 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_030337__991 | 0 | 0.0 | 2.05819 | 0 | [120, 106] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_030337__991.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2656 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_232821__288 | 0 | 0.0 | 1.9188 | 0 | [120, 98] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231226_232821__288.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2657 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_003506__618 | 0 | 0.0 | 16.7829 | 0 | [184, 454] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231214_003506__618.json | 25.0 | missing | missing | missing | |
| 2658 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_030330__125 | 0 | 0.0 | 7.49792 | 0 | [192, 240] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_030330__125.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2659 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_030334__364 | 0 | 0.0 | 4.26658 | 3 | [192, 220] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_030334__364.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2660 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_232819__499 | 0 | 0.0 | 5.26938 | 0 | [192, 123] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231226_232819__499.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2661 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_003607__717 | 0 | 0.0 | 17.0085 | 0 | [11, 464] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231214_003607__717.json | 50.0 | missing | missing | missing | |
| 2662 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_030409__270 | 0 | 0.0 | 7.32203 | 0 | [370, 336] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_030409__270.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2663 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_030415__860 | 0 | 0.0 | 5.9128 | 0 | [370, 263] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_030415__860.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2664 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_232832__823 | 0 | 0.0 | 3.74105 | 3 | [370, 148] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231226_232832__823.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2665 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_003550__880 | 0 | 0.0 | 17.337 | 0 | [379, 389] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231214_003550__880.json | 50.0 | missing | missing | missing | |
| 2666 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_030357__473 | 0 | 0.0 | 3.67284 | 0 | [368, 144] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_030357__473.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2667 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_030401__575 | 0 | 0.0 | 3.93186 | 0 | [368, 159] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_030401__575.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2668 | Apple-MacBook-Pro-M1 | clean_column | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_232828__899 | 0 | 0.0 | 4.71995 | 0 | [368, 200] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231226_232828__899.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2669 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231214_002922__138 | 4 | 0.0 | 9.71859 | 3 | [79, 289] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__InJulia__1SHOT__20231214_002922__138.json | 95.0 | missing | missing | missing | |
| 2670 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_025011__965 | 0 | 0.0 | 6.23612 | 0 | [83, 194] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_025011__965.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2671 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_025016__463 | 0 | 0.0 | 4.9652 | 0 | [83, 152] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_025016__463.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2672 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231226_232234__649 | 5 | 0.0 | 4.88108 | 3 | [83, 150] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__InJulia__1SHOT__20231226_232234__649.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2673 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_002913__461 | 0 | 0.0 | 5.60071 | 0 | [108, 153] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231214_002913__461.json | 50.0 | missing | missing | missing | |
| 2674 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_024958__389 | 0 | 0.0 | 4.60681 | 0 | [124, 135] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_024958__389.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2675 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_025005__574 | 0 | 0.0 | 6.25347 | 0 | [124, 190] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_025005__574.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2676 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_232229__985 | 0 | 0.0 | 5.13547 | 0 | [124, 153] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231226_232229__985.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2677 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_002907__410 | 0 | 0.0 | 12.4337 | 0 | [184, 333] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231214_002907__410.json | 50.0 | missing | missing | missing | |
| 2678 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_024944__237 | 0 | 0.0 | 21.0784 | 0 | [200, 472] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_024944__237.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2679 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_024954__438 | 2 | 0.0 | 9.89302 | 3 | [200, 292] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_024954__438.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2680 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_232223__113 | 0 | 0.0 | 16.7775 | 0 | [200, 350] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231226_232223__113.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2681 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_003014__327 | 0 | 0.0 | 20.3963 | 0 | [11, 551] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231214_003014__327.json | 0.0 | missing | missing | missing | |
| 2682 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_025051__587 | 5 | 0.0 | 7.85462 | 3 | [391, 194] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_025051__587.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2683 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_025059__745 | 0 | 0.0 | 8.63762 | 0 | [391, 218] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_025059__745.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2684 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_232301__241 | 0 | 0.0 | 14.0533 | 0 | [391, 389] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231226_232301__241.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2685 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_002954__785 | 0 | 0.0 | 17.8291 | 0 | [379, 402] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231214_002954__785.json | 50.0 | missing | missing | missing | |
| 2686 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_025037__410 | 0 | 0.0 | 6.58774 | 0 | [389, 153] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_025037__410.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2687 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_025043__807 | 0 | 0.0 | 5.64814 | 0 | [389, 123] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_025043__807.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2688 | Apple-MacBook-Pro-M1 | clean_column | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_232247__844 | 0 | 0.0 | 12.9514 | 0 | [389, 355] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231226_232247__844.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2689 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | InJulia | 1SHOT | false | false | 5 | 20231214_003045__284 | 0 | 0.0 | 10.2376 | 0 | [79, 304] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__InJulia__1SHOT__20231214_003045__284.json | 0.0 | missing | missing | missing | |
| 2690 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231225_025300__758 | 5 | 0.0 | 40.4818 | 3 | [78, 302] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_025300__758.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2691 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231225_025341__510 | 5 | 0.0 | 40.8867 | 3 | [78, 305] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_025341__510.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2692 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231226_232440__994 | 0 | 0.0 | 48.4699 | 2 | [78, 365] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__InJulia__1SHOT__20231226_232440__994.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 2693 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_003035__707 | 0 | 0.0 | 5.2328 | 0 | [108, 142] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231214_003035__707.json | 0.0 | missing | missing | missing | |
| 2694 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_025157__364 | 0 | 0.0 | 8.5765 | 0 | [117, 47] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_025157__364.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2695 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_025219__831 | 0 | 0.0 | 21.7883 | 0 | [117, 151] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_025219__831.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2696 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_232352__814 | 0 | 0.0 | 7.6409 | 3 | [117, 40] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231226_232352__814.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2697 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_003030__126 | 0 | 0.0 | 15.7708 | 0 | [184, 427] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231214_003030__126.json | 50.0 | missing | missing | missing | |
| 2698 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_025136__780 | 0 | 0.0 | 36.6313 | 3 | [192, 65] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_025136__780.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2699 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_025149__717 | 0 | 0.0 | 12.572 | 0 | [192, 68] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_025149__717.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2700 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_232344__584 | 5 | 0.0 | 43.1168 | 3 | [192, 132] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231226_232344__584.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2701 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_003122__420 | 0 | 0.0 | 18.5528 | 0 | [11, 504] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231214_003122__420.json | 0.0 | missing | missing | missing | |
| 2702 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_025713__774 | 5 | 0.0 | 64.9576 | 3 | [391, 422] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_025713__774.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2703 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_025832__722 | 0 | 0.0 | 78.7542 | 0 | [391, 522] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_025832__722.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2704 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_232632__423 | 0 | 0.0 | 62.6121 | 3 | [391, 407] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231226_232632__423.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2705 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_003103__187 | 0 | 0.0 | 11.4866 | 0 | [379, 231] | 0.4.0 | 3 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231214_003103__187.json | 25.0 | missing | missing | missing | |
| 2706 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_025534__415 | 5 | 0.0 | 36.3488 | 3 | [389, 211] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_025534__415.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2707 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_025608__637 | 0 | 0.0 | 33.4201 | 0 | [389, 189] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_025608__637.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2708 | Apple-MacBook-Pro-M1 | clean_column | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_232529__967 | 5 | 0.0 | 49.2928 | 3 | [389, 309] | 0.6.0 | 3 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/clean_column/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231226_232529__967.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2709 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231214_004626__458 | 1 | 0.0 | 15.8863 | 1 | [124, 453] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231214_004626__458.json | 60.0 | missing | missing | missing | |
| 2710 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_171358__906 | 1 | 0.0 | 14.8461 | 1 | [132, 259] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_171358__906.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2711 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_171426__563 | 0 | 0.0 | 27.7913 | 0 | [132, 497] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_171426__563.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2712 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | InJulia | 1SHOT | true | false | 5 | 20231226_235127__501 | 0 | 0.0 | 14.9984 | 0 | [132, 260] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231226_235127__501.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2713 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_004610__809 | 1 | 0.0 | 14.227 | 1 | [153, 395] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231214_004610__809.json | 60.0 | missing | missing | missing | |
| 2714 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_171330__504 | 0 | 0.0 | 8.51782 | 0 | [170, 134] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_171330__504.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2715 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_171343__472 | 4 | 0.0 | 12.6566 | 5 | [170, 212] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_171343__472.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2716 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_235112__303 | 0 | 0.0 | 16.6936 | 0 | [170, 286] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231226_235112__303.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2717 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_004556__249 | 0 | 0.0 | 19.7405 | 0 | [300, 485] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231214_004556__249.json | 25.0 | missing | missing | missing | |
| 2718 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_171302__929 | 3 | 0.0 | 23.8435 | 5 | [318, 214] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_171302__929.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2719 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_171322__533 | 0 | 0.0 | 19.432 | 0 | [318, 312] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_171322__533.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2720 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_235055__384 | 1 | 0.0 | 26.2682 | 1 | [318, 264] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231226_235055__384.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2721 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_004703__540 | 0 | 0.0 | 7.64938 | 0 | [11, 210] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231214_004703__540.json | 0.0 | missing | missing | missing | |
| 2722 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_171635__893 | 0 | 0.0 | 36.2926 | 0 | [435, 580] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_171635__893.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2723 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_171701__848 | 0 | 0.0 | 25.7624 | 0 | [435, 399] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_171701__848.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2724 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_235219__988 | 0 | 0.0 | 22.6399 | 0 | [435, 343] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231226_235219__988.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2725 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_004656__697 | 0 | 0.0 | 15.0089 | 0 | [424, 305] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231214_004656__697.json | 0.0 | missing | missing | missing | |
| 2726 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_171540__588 | 1 | 0.0 | 28.3116 | 1 | [432, 444] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_171540__588.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2727 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_171558__720 | 5 | 0.0 | 18.3214 | 5 | [432, 268] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_171558__720.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2728 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | false | 5 | 20231226_235157__144 | 0 | 0.0 | 30.0682 | 0 | [432, 472] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231226_235157__144.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2729 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | InJulia | 1SHOT | true | false | 5 | 20231214_004803__561 | 0 | 0.0 | 14.747 | 0 | [124, 422] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__InJulia__1SHOT__20231214_004803__561.json | 25.0 | missing | missing | missing | |
| 2730 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_171750__214 | 0 | 0.0 | 8.59846 | 0 | [106, 146] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_171750__214.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2731 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_171809__475 | 0 | 0.0 | 19.3163 | 0 | [106, 348] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_171809__475.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2732 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_004748__480 | 0 | 0.0 | 17.4715 | 0 | [153, 485] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231214_004748__480.json | 25.0 | missing | missing | missing | |
| 2733 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_171733__763 | 0 | 0.0 | 8.12322 | 0 | [107, 137] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_171733__763.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2734 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_171741__337 | 0 | 0.0 | 8.28883 | 0 | [107, 140] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_171741__337.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2735 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_004731__703 | 1 | 0.0 | 27.5178 | 1 | [300, 683] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231214_004731__703.json | 60.0 | missing | missing | missing | |
| 2736 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_171717__401 | 0 | 0.0 | 16.5429 | 0 | [193, 95] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_171717__401.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2737 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_171725__492 | 0 | 0.0 | 7.32047 | 0 | [193, 106] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_171725__492.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2738 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_004935__515 | 0 | 0.0 | 34.5578 | 0 | [11, 880] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231214_004935__515.json | 25.0 | missing | missing | missing | |
| 2739 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_172135__218 | 0 | 0.0 | 1.61038 | 0 | [124, 11] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_172135__218.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2740 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_172137__347 | 0 | 0.0 | 1.55744 | 0 | [124, 10] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_172137__347.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2741 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_004901__206 | 0 | 0.0 | 35.6118 | 0 | [424, 817] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231214_004901__206.json | 25.0 | missing | missing | missing | |
| 2742 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_172121__763 | 0 | 0.0 | 40.5756 | 0 | [121, 728] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_172121__763.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2743 | Apple-MacBook-Pro-M1 | event_scheduler | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_172133__766 | 0 | 0.0 | 12.5963 | 0 | [121, 221] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_172133__766.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2744 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_231759__698 | 0 | 0.0 | 19.4511 | 0 | [1, 576] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_231759__698.json | 25.0 | missing | missing | missing | |
| 2745 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_231818__233 | 0 | 0.0 | 19.0131 | 0 | [1, 564] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_231818__233.json | 25.0 | missing | missing | missing | |
| 2746 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_174957__875 | 5 | 0.0 | 90.3689 | 5 | [122, 530] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_174957__875.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2747 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_175045__697 | 0 | 0.0 | 47.2121 | 0 | [122, 269] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_175045__697.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2748 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_000329__576 | 5 | 0.0 | 96.8071 | 5 | [122, 580] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_000329__576.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2749 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_231707__766 | 0 | 0.0 | 9.01952 | 0 | [1, 277] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_231707__766.json | 0.0 | missing | missing | missing | |
| 2750 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_231723__534 | 0 | 0.0 | 15.3504 | 0 | [1, 458] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_231723__534.json | 0.0 | missing | missing | missing | |
| 2751 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_174723__299 | 5 | 0.0 | 62.123 | 5 | [163, 360] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_174723__299.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2752 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_174826__961 | 5 | 0.0 | 61.5501 | 5 | [163, 351] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_174826__961.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2753 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_000151__115 | 5 | 0.0 | 73.8858 | 5 | [163, 431] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_000151__115.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2754 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_231633__429 | 0 | 0.0 | 12.5158 | 0 | [1, 361] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_231633__429.json | 25.0 | missing | missing | missing | |
| 2755 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_231642__263 | 0 | 0.0 | 9.43716 | 0 | [1, 276] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_231642__263.json | 25.0 | missing | missing | missing | |
| 2756 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_174509__530 | 2 | 0.0 | 100.043 | 4 | [310, 400] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_174509__530.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2757 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_174621__951 | 0 | 0.0 | 71.1954 | 0 | [310, 389] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_174621__951.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2758 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_000036__389 | 1 | 0.0 | 90.759 | 1 | [310, 369] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_000036__389.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2759 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_232119__551 | 0 | 0.0 | 20.8047 | 0 | [1, 560] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_232119__551.json | 25.0 | missing | missing | missing | |
| 2760 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_232144__711 | 0 | 0.0 | 25.1982 | 0 | [1, 668] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_232144__711.json | 25.0 | missing | missing | missing | |
| 2761 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_175643__125 | 4 | 0.0 | 90.9193 | 5 | [451, 479] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_175643__125.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2762 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_175811__721 | 5 | 0.0 | 86.2169 | 5 | [451, 448] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_175811__721.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2763 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_000611__536 | 5 | 0.0 | 84.2061 | 5 | [451, 439] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_000611__536.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2764 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_232007__537 | 0 | 0.0 | 22.9554 | 0 | [1, 614] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_232007__537.json | 25.0 | missing | missing | missing | |
| 2765 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_232040__375 | 0 | 0.0 | 32.9607 | 0 | [1, 851] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_232040__375.json | 25.0 | missing | missing | missing | |
| 2766 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_175409__355 | 0 | 0.0 | 82.4326 | 0 | [449, 429] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_175409__355.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2767 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_175511__215 | 5 | 0.0 | 61.3079 | 5 | [449, 304] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_175511__215.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2768 | Apple-MacBook-Pro-M1 | event_scheduler | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_000447__196 | 1 | 0.0 | 77.0941 | 1 | [449, 397] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_000447__196.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2769 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_001606__811 | 1 | 0.0 | 10.4637 | 1 | [121, 395] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_001606__811.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2770 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_112433__113 | 0 | 0.0 | 8.35668 | 0 | [121, 315] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_112433__113.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2771 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_112444__283 | 0 | 0.0 | 10.7785 | 0 | [121, 407] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_112444__283.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2772 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_112454__883 | 0 | 0.0 | 10.0575 | 0 | [121, 380] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_112454__883.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2773 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_001556__425 | 0 | 0.0 | 7.00391 | 0 | [158, 258] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_001556__425.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2774 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_112414__760 | 0 | 0.0 | 10.9344 | 0 | [158, 406] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_112414__760.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2775 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_112420__477 | 0 | 0.0 | 5.86859 | 0 | [158, 214] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_112420__477.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2776 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_112425__500 | 0 | 0.0 | 4.44336 | 0 | [158, 159] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_112425__500.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2777 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_001549__228 | 0 | 0.0 | 11.7197 | 0 | [276, 286] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_001549__228.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2778 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_112344__373 | 0 | 0.0 | 13.6085 | 0 | [276, 360] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_112344__373.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2779 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_112348__921 | 1 | 0.0 | 4.29583 | 1 | [276, 134] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_112348__921.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2780 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_112403__475 | 0 | 0.0 | 14.9905 | 0 | [276, 526] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_112403__475.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2781 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_001626__654 | 0 | 0.0 | 13.6669 | 0 | [410, 449] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_001626__654.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2782 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_112530__831 | 0 | 0.0 | 7.46289 | 0 | [410, 228] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_112530__831.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2783 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_112535__772 | 1 | 0.0 | 5.46074 | 5 | [410, 155] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_112535__772.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2784 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_112543__552 | 1 | 0.0 | 7.38378 | 1 | [410, 225] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_112543__552.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2785 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_001612__497 | 1 | 0.0 | 5.60151 | 1 | [407, 160] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_001612__497.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2786 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_112506__506 | 1 | 0.0 | 12.3622 | 1 | [407, 403] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_112506__506.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2787 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_112512__157 | 0 | 0.0 | 4.94238 | 0 | [407, 135] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_112512__157.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2788 | Apple-MacBook-Pro-M1 | event_scheduler | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_112522__286 | 0 | 0.0 | 10.2554 | 0 | [407, 329] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_112522__286.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2789 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | InJulia | 1SHOT | true | true | 5 | 20231214_003857__675 | 1 | 0.0 | 15.2401 | 1 | [124, 436] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__InJulia__1SHOT__20231214_003857__675.json | 60.0 | missing | missing | missing | |
| 2790 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | InJulia | 1SHOT | true | false | 5 | 20231225_165128__210 | 0 | 0.0 | 19.0368 | 0 | [124, 543] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__InJulia__1SHOT__20231225_165128__210.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2791 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | InJulia | 1SHOT | true | true | 5 | 20231225_165142__736 | 1 | 0.0 | 14.7271 | 1 | [1, 447] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__InJulia__1SHOT__20231225_165142__736.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2792 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | InJulia | 1SHOT | false | false | 5 | 20231226_234205__117 | 0 | 0.0 | 13.2836 | 0 | [124, 386] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__InJulia__1SHOT__20231226_234205__117.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2793 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_003841__348 | 0 | 0.0 | 4.40315 | 0 | [153, 104] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaExpertAsk__1SHOT__20231214_003841__348.json | 0.0 | missing | missing | missing | |
| 2794 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_165056__536 | 0 | 0.0 | 13.8085 | 0 | [153, 385] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_165056__536.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2795 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_165108__794 | 1 | 0.0 | 11.955 | 1 | [1, 363] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_165108__794.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2796 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_234152__841 | 1 | 0.0 | 16.9356 | 1 | [153, 478] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaExpertAsk__1SHOT__20231226_234152__841.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2797 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_003837__137 | 0 | 0.0 | 16.0641 | 0 | [300, 387] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_003837__137.json | 25.0 | missing | missing | missing | |
| 2798 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_165019__283 | 0 | 0.0 | 22.6332 | 0 | [318, 433] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_165019__283.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2799 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_165043__418 | 0 | 0.0 | 23.8932 | 0 | [1, 658] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_165043__418.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2800 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_234135__616 | 0 | 0.0 | 14.9713 | 0 | [318, 233] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_234135__616.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2801 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_003958__927 | 0 | 0.0 | 15.1994 | 0 | [11, 412] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_003958__927.json | 0.0 | missing | missing | missing | |
| 2802 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_165323__755 | 0 | 0.0 | 22.2524 | 0 | [11, 592] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_165323__755.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2803 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_165357__114 | 0 | 0.0 | 34.1486 | 0 | [1, 879] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_165357__114.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2804 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_234236__775 | 0 | 0.0 | 18.4574 | 0 | [11, 502] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_234236__775.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2805 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_003943__938 | 0 | 0.0 | 30.8686 | 0 | [424, 705] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaRecapTask__1SHOT__20231214_003943__938.json | 0.0 | missing | missing | missing | |
| 2806 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_165238__201 | 1 | 0.0 | 24.5327 | 1 | [424, 553] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_165238__201.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2807 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_165300__606 | 0 | 0.0 | 22.4514 | 0 | [1, 602] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_165300__606.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2808 | Apple-MacBook-Pro-M1 | event_scheduler | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_234218__377 | 0 | 0.0 | 12.3287 | 0 | [424, 237] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/llama2/evaluation__JuliaRecapTask__1SHOT__20231226_234218__377.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2809 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | InJulia | 1SHOT | true | true | 5 | 20231214_005032__550 | 1 | 0.0 | 16.6628 | 1 | [124, 475] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__InJulia__1SHOT__20231214_005032__550.json | 60.0 | missing | missing | missing | |
| 2810 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_172250__376 | 0 | 0.0 | 16.52 | 0 | [124, 535] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__InJulia__1SHOT__20231225_172250__376.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2811 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_172301__325 | 1 | 0.0 | 10.0953 | 1 | [124, 325] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__InJulia__1SHOT__20231225_172301__325.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2812 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | InJulia | 1SHOT | true | true | 5 | 20231226_235303__252 | 1 | 0.0 | 11.746 | 1 | [124, 377] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__InJulia__1SHOT__20231226_235303__252.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2813 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_005015__540 | 0 | 0.0 | 10.9916 | 0 | [153, 303] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231214_005015__540.json | 0.0 | missing | missing | missing | |
| 2814 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_172221__340 | 0 | 0.0 | 10.6509 | 0 | [163, 333] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_172221__340.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2815 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_172233__463 | 2 | 0.0 | 12.2684 | 4 | [163, 387] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_172233__463.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2816 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_235251__342 | 0 | 0.0 | 14.0741 | 0 | [163, 442] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231226_235251__342.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2817 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_005004__757 | 0 | 0.0 | 29.1408 | 0 | [300, 723] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231214_005004__757.json | 25.0 | missing | missing | missing | |
| 2818 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_172156__868 | 0 | 0.0 | 19.6631 | 0 | [310, 402] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_172156__868.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2819 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_172210__362 | 0 | 0.0 | 13.0568 | 0 | [310, 384] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_172210__362.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2820 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_235237__864 | 1 | 0.0 | 17.3382 | 1 | [310, 335] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231226_235237__864.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2821 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_005124__786 | 0 | 0.0 | 12.8878 | 0 | [11, 352] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231214_005124__786.json | 0.0 | missing | missing | missing | |
| 2822 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_172407__211 | 0 | 0.0 | 11.9519 | 0 | [427, 324] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_172407__211.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2823 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_172425__934 | 0 | 0.0 | 18.2161 | 0 | [427, 515] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_172425__934.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2824 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_235332__786 | 3 | 0.0 | 15.0047 | 5 | [427, 418] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231226_235332__786.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2825 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_005111__137 | 1 | 0.0 | 23.2406 | 1 | [424, 518] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaRecapTask__1SHOT__20231214_005111__137.json | 60.0 | missing | missing | missing | |
| 2826 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_172340__967 | 1 | 0.0 | 18.6923 | 1 | [424, 534] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_172340__967.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2827 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_172354__191 | 0 | 0.0 | 14.2659 | 0 | [424, 398] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_172354__191.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2828 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_235317__579 | 0 | 0.0 | 13.4932 | 0 | [424, 371] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder/evaluation__JuliaRecapTask__1SHOT__20231226_235317__579.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2829 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_180522__266 | 0 | 0.0 | 14.3524 | 0 | [124, 272] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_180522__266.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2830 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_180539__517 | 0 | 0.0 | 16.9282 | 0 | [124, 323] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_180539__517.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2831 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_180556__892 | 1 | 0.0 | 17.2363 | 1 | [124, 329] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_180556__892.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2832 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_180426__479 | 1 | 0.0 | 20.0131 | 1 | [163, 377] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_180426__479.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2833 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_180443__579 | 1 | 0.0 | 17.2089 | 1 | [163, 322] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_180443__579.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2834 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_180507__664 | 1 | 0.0 | 23.1992 | 1 | [163, 432] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_180507__664.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2835 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_180322__815 | 1 | 0.0 | 24.1334 | 1 | [310, 437] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_180322__815.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2836 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_180339__589 | 4 | 0.0 | 16.5766 | 4 | [310, 292] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_180339__589.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2837 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_180406__353 | 0 | 0.0 | 25.3634 | 0 | [310, 460] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_180406__353.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2838 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_180719__986 | 0 | 0.0 | 27.1019 | 0 | [427, 433] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_180719__986.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2839 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_180743__786 | 0 | 0.0 | 23.9632 | 0 | [427, 407] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_180743__786.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2840 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_180801__977 | 0 | 0.0 | 17.861 | 0 | [427, 298] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_180801__977.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2841 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_180608__770 | 1 | 0.0 | 11.6416 | 1 | [424, 179] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_180608__770.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2842 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_180634__509 | 1 | 0.0 | 25.177 | 1 | [424, 436] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_180634__509.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2843 | Apple-MacBook-Pro-M1 | event_scheduler | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_180652__563 | 1 | 0.0 | 16.6992 | 1 | [424, 263] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_180652__563.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2844 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_232935__450 | 0 | 0.0 | 10.0142 | 0 | [1, 310] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_232935__450.json | 0.0 | missing | missing | missing | |
| 2845 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_232958__191 | 0 | 0.0 | 22.2451 | 0 | [1, 651] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_232958__191.json | 25.0 | missing | missing | missing | |
| 2846 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_180456__719 | 1 | 0.0 | 15.0538 | 1 | [120, 373] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_180456__719.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2847 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231225_180512__275 | 0 | 0.0 | 16.2533 | 0 | [120, 403] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_180512__275.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2848 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231227_000929__462 | 0 | 0.0 | 13.4852 | 0 | [120, 332] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_000929__462.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2849 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_232850__215 | 0 | 0.0 | 12.2316 | 0 | [1, 370] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_232850__215.json | 25.0 | missing | missing | missing | |
| 2850 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_232902__585 | 0 | 0.0 | 12.0173 | 0 | [1, 364] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_232902__585.json | 25.0 | missing | missing | missing | |
| 2851 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_180430__857 | 0 | 0.0 | 10.8148 | 0 | [161, 255] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_180430__857.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2852 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_180441__972 | 0 | 0.0 | 10.32 | 0 | [161, 243] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_180441__972.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2853 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_000916__411 | 0 | 0.0 | 12.8679 | 0 | [161, 307] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_000916__411.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2854 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_232806__214 | 0 | 0.0 | 11.2386 | 0 | [1, 326] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_232806__214.json | 25.0 | missing | missing | missing | |
| 2855 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_232818__122 | 0 | 0.0 | 11.4216 | 0 | [1, 331] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_232818__122.json | 25.0 | missing | missing | missing | |
| 2856 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_180401__953 | 0 | 0.0 | 21.9271 | 0 | [308, 373] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_180401__953.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2857 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_180419__841 | 1 | 0.0 | 18.1683 | 1 | [308, 418] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_180419__841.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2858 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_000903__300 | 0 | 0.0 | 38.8534 | 0 | [308, 785] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_000903__300.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2859 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_233332__996 | 0 | 0.0 | 28.7447 | 0 | [1, 752] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_233332__996.json | 25.0 | missing | missing | missing | |
| 2860 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_233356__334 | 0 | 0.0 | 23.5894 | 0 | [1, 629] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_233356__334.json | 0.0 | missing | missing | missing | |
| 2861 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_180641__480 | 1 | 0.0 | 17.3299 | 1 | [428, 374] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_180641__480.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2862 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_180704__912 | 0 | 0.0 | 23.2448 | 0 | [428, 519] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_180704__912.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2863 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_001032__574 | 0 | 0.0 | 46.1297 | 0 | [428, 1047] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_001032__574.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2864 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_233203__922 | 0 | 0.0 | 49.1653 | 0 | [1, 1206] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_233203__922.json | 25.0 | missing | missing | missing | |
| 2865 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_233237__756 | 0 | 0.0 | 33.2246 | 0 | [1, 857] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_233237__756.json | 25.0 | missing | missing | missing | |
| 2866 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_180556__330 | 0 | 0.0 | 15.8858 | 0 | [426, 339] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_180556__330.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2867 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_180623__499 | 0 | 0.0 | 26.9476 | 0 | [426, 608] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_180623__499.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2868 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_000945__784 | 0 | 0.0 | 15.8975 | 0 | [426, 340] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_000945__784.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2869 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | false | false | 5 | 20231227_232730__414 | 0 | 0.0 | 11.5068 | 0 | [119, 357] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_232730__414.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2870 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | false | 5 | 20231227_232749__256 | 0 | 0.0 | 19.5137 | 0 | [119, 608] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_232749__256.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2871 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_232804__610 | 0 | 0.0 | 14.844 | 0 | [119, 463] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_232804__610.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2872 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_232817__598 | 4 | 0.0 | 12.4317 | 5 | [119, 387] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_232817__598.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2873 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_232836__531 | 0 | 0.0 | 18.1831 | 0 | [119, 567] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_232836__531.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2874 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_232645__534 | 0 | 0.0 | 7.35757 | 0 | [160, 218] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_232645__534.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2875 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_232654__558 | 0 | 0.0 | 8.37693 | 0 | [160, 251] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_232654__558.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2876 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_232702__196 | 0 | 0.0 | 8.71117 | 0 | [160, 262] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_232702__196.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2877 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_232708__339 | 0 | 0.0 | 6.12162 | 0 | [160, 178] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_232708__339.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2878 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_232718__729 | 0 | 0.0 | 9.59953 | 0 | [160, 290] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_232718__729.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2879 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_232525__878 | 1 | 0.0 | 13.7538 | 1 | [307, 373] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_232525__878.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2880 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_232546__364 | 0 | 0.0 | 20.0305 | 0 | [307, 582] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_232546__364.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2881 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_232602__748 | 0 | 0.0 | 16.1634 | 0 | [307, 462] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_232602__748.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2882 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_232619__458 | 0 | 0.0 | 17.3022 | 0 | [307, 501] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_232619__458.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2883 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_232638__504 | 0 | 0.0 | 18.4443 | 0 | [307, 534] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_232638__504.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2884 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_233035__412 | 1 | 0.0 | 20.6368 | 1 | [427, 573] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_233035__412.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2885 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_233053__307 | 0 | 0.0 | 17.0421 | 0 | [427, 466] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_233053__307.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2886 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_233104__897 | 0 | 0.0 | 10.7946 | 0 | [427, 277] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_233104__897.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2887 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_233118__722 | 0 | 0.0 | 14.6465 | 0 | [427, 395] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_233118__722.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2888 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_233141__665 | 0 | 0.0 | 23.0017 | 0 | [427, 643] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_233141__665.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2889 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_232851__923 | 0 | 0.0 | 15.3954 | 0 | [425, 418] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_232851__923.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2890 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_232905__956 | 0 | 0.0 | 14.021 | 0 | [425, 376] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_232905__956.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2891 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_232931__453 | 0 | 0.0 | 25.6321 | 0 | [425, 719] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_232931__453.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2892 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_232952__493 | 0 | 0.0 | 20.7642 | 0 | [425, 578] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_232952__493.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2893 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_233015__144 | 0 | 0.0 | 23.1223 | 0 | [425, 646] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_233015__144.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2894 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_233434__325 | 0 | 0.0 | 12.4716 | 0 | [119, 305] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_233434__325.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2895 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_233449__351 | 0 | 0.0 | 15.5926 | 0 | [119, 384] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_233449__351.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2896 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231227_233505__619 | 0 | 0.0 | 16.1424 | 0 | [119, 398] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_233505__619.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2897 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_233519__748 | 0 | 0.0 | 14.0446 | 0 | [119, 345] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_233519__748.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2898 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_233539__787 | 0 | 0.0 | 19.6661 | 0 | [119, 486] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_233539__787.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2899 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_233346__561 | 0 | 0.0 | 5.45156 | 0 | [160, 120] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_233346__561.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2900 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_233357__795 | 0 | 0.0 | 11.4513 | 0 | [160, 274] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_233357__795.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2901 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_233403__638 | 0 | 0.0 | 5.72493 | 0 | [160, 127] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_233403__638.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2902 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_233409__849 | 0 | 0.0 | 6.29588 | 0 | [160, 142] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_233409__849.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2903 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_233421__315 | 0 | 0.0 | 11.5734 | 0 | [160, 277] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_233421__315.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2904 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_233204__619 | 0 | 0.0 | 23.0643 | 0 | [307, 515] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_233204__619.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2905 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_233220__885 | 0 | 0.0 | 15.6309 | 0 | [307, 353] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_233220__885.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2906 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_233247__469 | 0 | 0.0 | 26.6227 | 0 | [307, 620] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_233247__469.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2907 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_233308__549 | 1 | 0.0 | 21.672 | 1 | [307, 501] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_233308__549.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2908 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_233340__444 | 0 | 0.0 | 31.8406 | 0 | [307, 744] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_233340__444.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2909 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_233807__520 | 0 | 0.0 | 17.1494 | 0 | [427, 369] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_233807__520.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2910 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_233823__705 | 0 | 0.0 | 16.1236 | 0 | [427, 344] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_233823__705.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2911 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_233851__681 | 0 | 0.0 | 27.801 | 0 | [427, 624] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_233851__681.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2912 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_233908__998 | 1 | 0.0 | 17.6026 | 1 | [427, 380] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_233908__998.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2913 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_233934__350 | 0 | 0.0 | 26.0185 | 0 | [427, 582] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_233934__350.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2914 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_233607__168 | 1 | 0.0 | 27.6262 | 1 | [425, 620] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_233607__168.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2915 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_233624__487 | 0 | 0.0 | 16.3862 | 0 | [425, 350] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_233624__487.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2916 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_233642__305 | 0 | 0.0 | 18.0215 | 0 | [425, 390] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_233642__305.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2917 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_233718__288 | 0 | 0.0 | 36.3549 | 0 | [425, 823] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_233718__288.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2918 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_233749__660 | 0 | 0.0 | 31.3384 | 0 | [425, 707] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_233749__660.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2919 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_121612__690 | 0 | 0.0 | 24.2121 | 0 | [119, 442] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_121612__690.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2920 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231226_121632__295 | 0 | 0.0 | 19.797 | 0 | [119, 357] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_121632__295.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2921 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_001435__149 | 0 | 0.0 | 16.1957 | 0 | [119, 292] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231227_001435__149.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2922 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_121529__657 | 0 | 0.0 | 7.85832 | 0 | [160, 132] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_121529__657.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2923 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_121548__209 | 0 | 0.0 | 17.874 | 0 | [160, 320] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_121548__209.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2924 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_001419__637 | 0 | 0.0 | 8.13928 | 0 | [160, 137] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_001419__637.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2925 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_121457__546 | 0 | 0.0 | 19.1497 | 0 | [307, 324] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_121457__546.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2926 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_121521__772 | 0 | 0.0 | 23.6948 | 0 | [307, 407] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_121521__772.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2927 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_001411__924 | 0 | 0.0 | 41.7223 | 1 | [307, 573] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_001411__924.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2928 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_121906__891 | 0 | 0.0 | 38.7066 | 0 | [427, 658] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_121906__891.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2929 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231226_121939__595 | 0 | 0.0 | 33.1704 | 0 | [427, 560] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_121939__595.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2930 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_001537__404 | 0 | 0.0 | 24.0853 | 0 | [427, 396] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_001537__404.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2931 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_121747__678 | 0 | 0.0 | 32.5271 | 0 | [425, 549] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_121747__678.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2932 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_121827__743 | 0 | 0.0 | 40.0955 | 0 | [425, 683] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_121827__743.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2933 | Apple-MacBook-Pro-M1 | event_scheduler | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_001513__901 | 0 | 0.0 | 37.4031 | 0 | [425, 633] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_001513__901.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2934 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_113356__534 | 0 | 0.0 | 75.0369 | 0 | [123, 438] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_113356__534.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2935 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_113523__575 | 0 | 0.0 | 86.4353 | 0 | [123, 506] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_113523__575.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2936 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_113623__730 | 1 | 0.0 | 59.4673 | 1 | [123, 345] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_113623__730.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2937 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_150553__665 | 1 | 0.0 | 99.5454 | 1 | [123, 581] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_150553__665.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2938 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_150710__959 | 1 | 0.0 | 76.665 | 1 | [123, 446] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_150710__959.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2939 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_113139__445 | 1 | 0.0 | 63.7852 | 1 | [162, 361] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_113139__445.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2940 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_113204__941 | 0 | 0.0 | 24.1518 | 0 | [162, 121] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_113204__941.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2941 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_113241__663 | 0 | 0.0 | 37.4115 | 0 | [162, 202] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_113241__663.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2942 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_150321__206 | 1 | 0.0 | 27.0897 | 1 | [162, 138] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_150321__206.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2943 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_150413__935 | 0 | 0.0 | 52.1942 | 0 | [162, 290] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_150413__935.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2944 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_112734__236 | 1 | 0.0 | 110.792 | 1 | [313, 586] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_112734__236.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2945 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_112854__178 | 1 | 0.0 | 79.0073 | 1 | [313, 426] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_112854__178.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2946 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_113035__656 | 0 | 0.0 | 101.556 | 0 | [313, 557] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_113035__656.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2947 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_150034__588 | 0 | 0.0 | 123.306 | 1 | [313, 679] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_150034__588.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2948 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_150253__142 | 1 | 0.0 | 138.956 | 1 | [313, 767] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_150253__142.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2949 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_114041__402 | 0 | 0.0 | 33.249 | 0 | [436, 132] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_114041__402.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2950 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_114239__445 | 1 | 0.0 | 118.598 | 1 | [436, 627] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_114239__445.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2951 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_114428__896 | 1 | 0.0 | 108.713 | 1 | [436, 571] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_114428__896.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2952 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_151151__131 | 1 | 0.0 | 86.9161 | 1 | [436, 444] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_151151__131.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2953 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_151312__192 | 4 | 0.0 | 80.0327 | 5 | [436, 404] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_151312__192.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2954 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_113821__130 | 0 | 0.0 | 118.749 | 0 | [434, 628] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_113821__130.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2955 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_113923__606 | 4 | 0.0 | 61.8981 | 5 | [434, 301] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_113923__606.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2956 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_114008__725 | 1 | 0.0 | 43.8481 | 1 | [434, 195] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_114008__725.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2957 | Apple-MacBook-Pro-M1 | event_scheduler | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_151024__562 | 0 | 0.0 | 44.2058 | 0 | [434, 195] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_151024__562.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2958 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_233603__628 | 0 | 0.0 | 15.8316 | 0 | [1, 477] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_233603__628.json | 25.0 | missing | missing | missing | |
| 2959 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_233624__727 | 0 | 0.0 | 21.2782 | 0 | [1, 625] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231219_233624__727.json | 0.0 | missing | missing | missing | |
| 2960 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_180826__120 | 1 | 0.0 | 18.1424 | 1 | [128, 450] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_180826__120.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2961 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_180842__179 | 1 | 0.0 | 15.5197 | 1 | [128, 383] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_180842__179.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2962 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_001122__716 | 0 | 0.0 | 12.1005 | 0 | [128, 296] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231227_001122__716.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2963 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_233519__803 | 0 | 0.0 | 10.7559 | 0 | [1, 328] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_233519__803.json | 0.0 | missing | missing | missing | |
| 2964 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_233529__356 | 0 | 0.0 | 10.0393 | 0 | [1, 307] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_233529__356.json | 0.0 | missing | missing | missing | |
| 2965 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_180757__848 | 0 | 0.0 | 9.85069 | 0 | [169, 230] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_180757__848.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2966 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_180808__640 | 0 | 0.0 | 10.9163 | 0 | [169, 257] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_180808__640.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2967 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_001110__571 | 0 | 0.0 | 8.85653 | 0 | [169, 204] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_001110__571.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2968 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_233441__733 | 0 | 0.0 | 18.8836 | 0 | [1, 530] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_233441__733.json | 25.0 | missing | missing | missing | |
| 2969 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_233456__286 | 0 | 0.0 | 14.8199 | 0 | [1, 423] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_233456__286.json | 0.0 | missing | missing | missing | |
| 2970 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_180729__193 | 0 | 0.0 | 24.8618 | 0 | [316, 425] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_180729__193.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2971 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_180747__261 | 0 | 0.0 | 17.0523 | 0 | [316, 389] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_180747__261.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2972 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_001101__240 | 0 | 0.0 | 29.219 | 0 | [316, 540] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_001101__240.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2973 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_233905__396 | 0 | 0.0 | 23.5968 | 0 | [1, 629] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_233905__396.json | 25.0 | missing | missing | missing | |
| 2974 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_233937__742 | 0 | 0.0 | 31.9501 | 0 | [1, 827] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_233937__742.json | 25.0 | missing | missing | missing | |
| 2975 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_181009__873 | 4 | 0.0 | 24.1206 | 5 | [436, 539] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_181009__873.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2976 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_181031__157 | 0 | 0.0 | 20.8991 | 0 | [436, 459] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_181031__157.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2977 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_001233__127 | 0 | 0.0 | 32.4517 | 0 | [436, 733] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_001233__127.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2978 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_233807__919 | 0 | 0.0 | 27.6954 | 0 | [1, 728] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_233807__919.json | 0.0 | missing | missing | missing | |
| 2979 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_233825__788 | 0 | 0.0 | 18.2046 | 0 | [1, 496] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_233825__788.json | 25.0 | missing | missing | missing | |
| 2980 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_180933__953 | 1 | 0.0 | 22.7427 | 1 | [434, 506] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_180933__953.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2981 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_180945__169 | 4 | 0.0 | 11.2036 | 5 | [434, 224] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_180945__169.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2982 | Apple-MacBook-Pro-M1 | event_scheduler | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_001200__504 | 0 | 0.0 | 38.1619 | 0 | [434, 865] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_001200__504.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2983 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231214_004050__464 | 1 | 0.0 | 18.208 | 1 | [124, 518] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231214_004050__464.json | 60.0 | missing | missing | missing | |
| 2984 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_165506__865 | 1 | 0.0 | 9.55682 | 1 | [126, 301] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_165506__865.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2985 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_165519__279 | 0 | 0.0 | 13.2302 | 0 | [126, 420] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_165519__279.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2986 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231226_234315__817 | 1 | 0.0 | 9.52248 | 1 | [126, 298] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231226_234315__817.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2987 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_004031__803 | 1 | 0.0 | 13.9204 | 1 | [153, 386] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231214_004031__803.json | 60.0 | missing | missing | missing | |
| 2988 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_165447__515 | 1 | 0.0 | 13.2425 | 1 | [167, 409] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_165447__515.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2989 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_165456__355 | 1 | 0.0 | 7.83331 | 1 | [167, 234] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_165456__355.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2990 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_234306__420 | 1 | 0.0 | 11.4801 | 1 | [167, 351] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231226_234306__420.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2991 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_004017__321 | 0 | 0.0 | 19.1933 | 0 | [300, 470] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231214_004017__321.json | 25.0 | missing | missing | missing | |
| 2992 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_165418__321 | 0 | 0.0 | 20.5997 | 1 | [314, 454] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_165418__321.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2993 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_165434__925 | 1 | 0.0 | 15.9658 | 1 | [314, 470] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_165434__925.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2994 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_234254__531 | 4 | 0.0 | 17.7756 | 5 | [314, 369] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231226_234254__531.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2995 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_004152__451 | 1 | 0.0 | 23.106 | 1 | [11, 611] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231214_004152__451.json | 60.0 | missing | missing | missing | |
| 2996 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_165633__753 | 1 | 0.0 | 18.1781 | 1 | [434, 512] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_165633__753.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2997 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_165645__978 | 0 | 0.0 | 11.6872 | 0 | [434, 312] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_165645__978.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2998 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_234352__341 | 1 | 0.0 | 23.8949 | 1 | [434, 679] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231226_234352__341.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 2999 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_004129__509 | 0 | 0.0 | 21.8459 | 0 | [424, 483] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231214_004129__509.json | 0.0 | missing | missing | missing | |
| 3000 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_165557__815 | 1 | 0.0 | 15.1881 | 1 | [432, 419] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_165557__815.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3001 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_165615__757 | 0 | 0.0 | 17.4134 | 0 | [432, 489] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_165615__757.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3002 | Apple-MacBook-Pro-M1 | event_scheduler | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_234328__125 | 1 | 0.0 | 12.4309 | 1 | [432, 333] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231226_234328__125.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3003 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231214_005415__854 | 1 | 0.0 | 16.6729 | 1 | [124, 476] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__InJulia__1SHOT__20231214_005415__854.json | 60.0 | missing | missing | missing | |
| 3004 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_172843__310 | 0 | 0.0 | 2.66047 | 0 | [127, 31] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__InJulia__1SHOT__20231225_172843__310.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3005 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_172846__823 | 0 | 0.0 | 2.98491 | 0 | [127, 37] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__InJulia__1SHOT__20231225_172846__823.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3006 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231226_235520__196 | 0 | 0.0 | 2.86239 | 0 | [127, 35] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__InJulia__1SHOT__20231226_235520__196.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3007 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_005358__999 | 0 | 0.0 | 12.1132 | 1 | [153, 335] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231214_005358__999.json | 55.0 | missing | missing | missing | |
| 3008 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_172825__259 | 0 | 0.0 | 6.55695 | 0 | [166, 95] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_172825__259.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3009 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_172840__145 | 0 | 0.0 | 14.9424 | 0 | [166, 251] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_172840__145.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3010 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_235517__689 | 0 | 0.0 | 21.1746 | 0 | [166, 367] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231226_235517__689.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3011 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_005346__576 | 0 | 0.0 | 17.173 | 0 | [300, 417] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231214_005346__576.json | 0.0 | missing | missing | missing | |
| 3012 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_172744__444 | 0 | 0.0 | 66.0561 | 0 | [313, 899] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_172744__444.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3013 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_172819__250 | 0 | 0.0 | 34.2486 | 0 | [313, 559] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_172819__250.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3014 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_235456__415 | 0 | 0.0 | 34.075 | 0 | [313, 410] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231226_235456__415.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3015 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_005520__633 | 0 | 0.0 | 18.6467 | 0 | [11, 500] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231214_005520__633.json | 0.0 | missing | missing | missing | |
| 3016 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_173147__898 | 0 | 0.0 | 58.5885 | 0 | [430, 927] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_173147__898.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3017 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_173304__854 | 0 | 0.0 | 76.9558 | 0 | [430, 1201] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_173304__854.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3018 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_235535__376 | 0 | 0.0 | 7.85653 | 0 | [430, 79] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231226_235535__376.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3019 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_005502__814 | 0 | 0.0 | 30.196 | 0 | [424, 689] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231214_005502__814.json | 25.0 | missing | missing | missing | |
| 3020 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_173005__830 | 0 | 0.0 | 70.1601 | 0 | [427, 1092] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_173005__830.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3021 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_173048__551 | 0 | 0.0 | 43.6886 | 0 | [427, 691] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_173048__551.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3022 | Apple-MacBook-Pro-M1 | event_scheduler | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231226_235527__776 | 0 | 0.0 | 7.47177 | 0 | [427, 72] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231226_235527__776.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3023 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | true | false | 5 | 20231219_234156__950 | 0 | 0.0 | 18.9972 | 0 | [1, 564] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_234156__950.json | 25.0 | missing | missing | missing | |
| 3024 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | true | false | 5 | 20231219_234216__698 | 0 | 0.0 | 19.4042 | 0 | [1, 575] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231219_234216__698.json | 25.0 | missing | missing | missing | |
| 3025 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_181104__343 | 0 | 0.0 | 5.25884 | 0 | [115, 196] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_181104__343.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3026 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_181123__434 | 0 | 0.0 | 19.2087 | 0 | [115, 713] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_181123__434.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3027 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_001306__886 | 0 | 0.0 | 22.3478 | 0 | [115, 821] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231227_001306__886.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3028 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_234106__563 | 0 | 0.0 | 11.5896 | 0 | [1, 352] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_234106__563.json | 0.0 | missing | missing | missing | |
| 3029 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_234116__953 | 0 | 0.0 | 10.1056 | 0 | [1, 309] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231219_234116__953.json | 25.0 | missing | missing | missing | |
| 3030 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_181047__942 | 0 | 0.0 | 4.61234 | 0 | [152, 166] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_181047__942.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3031 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_181058__711 | 1 | 0.0 | 10.9637 | 1 | [152, 409] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_181058__711.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3032 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_001244__926 | 0 | 0.0 | 6.50601 | 0 | [152, 239] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_001244__926.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3033 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_234009__506 | 0 | 0.0 | 18.3373 | 0 | [1, 516] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_234009__506.json | 25.0 | missing | missing | missing | |
| 3034 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_234036__119 | 0 | 0.0 | 27.4405 | 0 | [1, 745] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231219_234036__119.json | 25.0 | missing | missing | missing | |
| 3035 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_181036__159 | 0 | 0.0 | 4.79623 | 0 | [270, 3] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_181036__159.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3036 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_181043__798 | 0 | 0.0 | 6.72911 | 0 | [270, 227] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_181043__798.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3037 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_001238__360 | 0 | 0.0 | 4.44613 | 0 | [270, 5] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_001238__360.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3038 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_234513__364 | 0 | 0.0 | 25.9923 | 0 | [1, 687] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_234513__364.json | 25.0 | missing | missing | missing | |
| 3039 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_234539__401 | 0 | 0.0 | 26.4196 | 0 | [1, 697] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231219_234539__401.json | 25.0 | missing | missing | missing | |
| 3040 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_181256__982 | 0 | 0.0 | 17.8393 | 0 | [404, 594] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_181256__982.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3041 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_181308__208 | 0 | 0.0 | 11.9465 | 0 | [404, 390] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_181308__208.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3042 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_001329__780 | 0 | 0.0 | 9.53648 | 0 | [404, 304] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_001329__780.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3043 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_234415__646 | 0 | 0.0 | 32.0753 | 0 | [1, 831] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_234415__646.json | 0.0 | missing | missing | missing | |
| 3044 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_234433__682 | 0 | 0.0 | 17.8564 | 0 | [1, 487] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231219_234433__682.json | 0.0 | missing | missing | missing | |
| 3045 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_181223__231 | 0 | 0.0 | 12.5273 | 0 | [401, 411] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_181223__231.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3046 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_181238__195 | 0 | 0.0 | 15.5757 | 0 | [401, 517] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_181238__195.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3047 | Apple-MacBook-Pro-M1 | event_scheduler | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_001320__314 | 0 | 0.0 | 13.165 | 0 | [401, 432] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_001320__314.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3048 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231214_005610__117 | 1 | 0.0 | 18.8931 | 1 | [124, 537] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231214_005610__117.json | 60.0 | missing | missing | missing | |
| 3049 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_173720__374 | 0 | 0.0 | 39.386 | 0 | [135, 294] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_173720__374.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3050 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | InJulia | 1SHOT | true | false | 5 | 20231225_173817__972 | 0 | 0.0 | 57.1932 | 0 | [135, 436] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_173817__972.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3051 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | InJulia | 1SHOT | true | false | 5 | 20231226_235827__869 | 0 | 0.0 | 31.0459 | 0 | [135, 225] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231226_235827__869.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3052 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_005550__659 | 1 | 0.0 | 13.4789 | 1 | [153, 374] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231214_005550__659.json | 60.0 | missing | missing | missing | |
| 3053 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_173547__954 | 1 | 0.0 | 56.8508 | 1 | [174, 412] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_173547__954.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3054 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_173641__971 | 0 | 0.0 | 52.5665 | 0 | [174, 392] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_173641__971.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3055 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_235755__181 | 2 | 0.0 | 51.5663 | 5 | [174, 381] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231226_235755__181.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3056 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_005537__435 | 1 | 0.0 | 16.3369 | 1 | [300, 394] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_005537__435.json | 60.0 | missing | missing | missing | |
| 3057 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_173420__725 | 3 | 0.0 | 76.0186 | 5 | [321, 357] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_173420__725.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3058 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_173451__127 | 0 | 0.0 | 29.969 | 0 | [321, 178] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_173451__127.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3059 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_235704__831 | 0 | 0.0 | 88.3907 | 0 | [321, 476] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231226_235704__831.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3060 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_005717__118 | 0 | 0.0 | 22.4306 | 0 | [11, 593] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_005717__118.json | 0.0 | missing | missing | missing | |
| 3061 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_174240__600 | 0 | 0.0 | 56.5199 | 0 | [438, 372] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_174240__600.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3062 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_174328__870 | 1 | 0.0 | 48.7537 | 1 | [438, 312] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_174328__870.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3063 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_235906__726 | 0 | 0.0 | 9.69317 | 0 | [438, 5] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231226_235906__726.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3064 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_005655__751 | 0 | 0.0 | 22.542 | 0 | [424, 499] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231214_005655__751.json | 0.0 | missing | missing | missing | |
| 3065 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_174046__719 | 1 | 0.0 | 42.6473 | 5 | [435, 265] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_174046__719.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3066 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_174143__819 | 0 | 0.0 | 56.3974 | 0 | [435, 371] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_174143__819.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3067 | Apple-MacBook-Pro-M1 | event_scheduler | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231226_235856__790 | 0 | 0.0 | 29.1137 | 0 | [435, 159] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231226_235856__790.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3068 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_232336__240 | 0 | 0.0 | 15.4543 | 0 | [1, 466] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_232336__240.json | 25.0 | missing | missing | missing | |
| 3069 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231219_232351__643 | 0 | 0.0 | 15.1844 | 0 | [1, 459] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_232351__643.json | 0.0 | missing | missing | missing | |
| 3070 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_180010__271 | 0 | 0.0 | 30.5155 | 5 | [128, 511] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_180010__271.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3071 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_180109__434 | 0 | 0.0 | 58.5187 | 5 | [128, 967] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_180109__434.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3072 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_000724__983 | 1 | 0.0 | 24.3558 | 1 | [128, 404] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231227_000724__983.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3073 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231219_232250__286 | 0 | 0.0 | 7.98963 | 0 | [1, 247] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_232250__286.json | 0.0 | missing | missing | missing | |
| 3074 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_232304__740 | 0 | 0.0 | 13.5668 | 0 | [1, 408] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_232304__740.json | 25.0 | missing | missing | missing | |
| 3075 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_175928__376 | 1 | 0.0 | 10.3057 | 5 | [169, 155] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_175928__376.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3076 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_175940__138 | 1 | 0.0 | 11.5181 | 1 | [169, 176] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_175940__138.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3077 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_000659__332 | 1 | 0.0 | 11.4472 | 1 | [169, 175] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_000659__332.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3078 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_232211__212 | 0 | 0.0 | 14.5891 | 0 | [1, 417] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_232211__212.json | 0.0 | missing | missing | missing | |
| 3079 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_232228__180 | 0 | 0.0 | 17.7035 | 0 | [1, 499] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_232228__180.json | 25.0 | missing | missing | missing | |
| 3080 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_175849__798 | 0 | 0.0 | 37.7977 | 0 | [316, 451] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_175849__798.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3081 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_175917__939 | 0 | 0.0 | 28.5244 | 0 | [316, 443] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_175917__939.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3082 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_000647__398 | 0 | 0.0 | 35.9159 | 0 | [316, 422] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_000647__398.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3083 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_232703__118 | 0 | 0.0 | 13.3089 | 0 | [1, 369] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_232703__118.json | 0.0 | missing | missing | missing | |
| 3084 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_232736__871 | 0 | 0.0 | 33.1367 | 0 | [1, 854] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_232736__871.json | 25.0 | missing | missing | missing | |
| 3085 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_180317__924 | 1 | 0.0 | 26.4295 | 1 | [436, 386] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_180317__924.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3086 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_180339__299 | 1 | 0.0 | 22.0969 | 1 | [436, 315] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_180339__299.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3087 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_000823__383 | 1 | 0.0 | 29.0012 | 1 | [436, 428] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_000823__383.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3088 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_232555__135 | 0 | 0.0 | 25.6916 | 0 | [1, 680] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_232555__135.json | 0.0 | missing | missing | missing | |
| 3089 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_232629__961 | 0 | 0.0 | 34.0013 | 0 | [1, 875] | 0.5.0-DEV | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_232629__961.json | 25.0 | missing | missing | missing | |
| 3090 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_180218__946 | 1 | 0.0 | 22.5102 | 1 | [434, 322] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_180218__946.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3091 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_180250__967 | 1 | 0.0 | 31.5511 | 1 | [434, 470] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_180250__967.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3092 | Apple-MacBook-Pro-M1 | event_scheduler | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_000754__777 | 1 | 0.0 | 29.35 | 1 | [434, 434] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_000754__777.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3093 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | InJulia | 1SHOT | true | true | 5 | 20231214_005220__524 | 0 | 0.0 | 12.9722 | 1 | [124, 371] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__InJulia__1SHOT__20231214_005220__524.json | 55.0 | missing | missing | missing | |
| 3094 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231225_172521__633 | 0 | 0.0 | 17.7103 | 0 | [126, 904] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_172521__633.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3095 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231225_172528__391 | 0 | 0.0 | 7.04211 | 0 | [126, 384] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_172528__391.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3096 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | InJulia | 1SHOT | true | false | 5 | 20231226_235401__979 | 0 | 0.0 | 9.67895 | 0 | [126, 528] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__InJulia__1SHOT__20231226_235401__979.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3097 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_005207__329 | 1 | 0.0 | 17.9101 | 1 | [153, 498] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231214_005207__329.json | 60.0 | missing | missing | missing | |
| 3098 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_172454__899 | 0 | 0.0 | 7.02696 | 0 | [163, 359] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_172454__899.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3099 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_172503__390 | 0 | 0.0 | 8.7292 | 0 | [163, 446] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_172503__390.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3100 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_235351__638 | 0 | 0.0 | 8.56118 | 0 | [163, 456] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231226_235351__638.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3101 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_005149__241 | 0 | 0.0 | 24.9243 | 0 | [300, 618] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231214_005149__241.json | 25.0 | missing | missing | missing | |
| 3102 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_172440__490 | 0 | 0.0 | 14.4275 | 0 | [278, 571] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_172440__490.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3103 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_172447__970 | 0 | 0.0 | 7.23043 | 0 | [278, 356] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_172447__970.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3104 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_235342__569 | 0 | 0.0 | 10.0353 | 0 | [278, 365] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231226_235342__569.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3105 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_005329__722 | 0 | 0.0 | 23.7351 | 0 | [11, 626] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231214_005329__722.json | 0.0 | missing | missing | missing | |
| 3106 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_172625__866 | 1 | 0.0 | 10.9192 | 1 | [413, 487] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_172625__866.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3107 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_172638__552 | 0 | 0.0 | 12.92 | 0 | [413, 591] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_172638__552.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3108 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_235422__684 | 1 | 0.0 | 12.8238 | 1 | [413, 592] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231226_235422__684.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3109 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_005305__630 | 0 | 0.0 | 25.7998 | 0 | [424, 582] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231214_005305__630.json | 0.0 | missing | missing | missing | |
| 3110 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_172557__137 | 0 | 0.0 | 17.8889 | 0 | [411, 797] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_172557__137.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3111 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_172614__689 | 0 | 0.0 | 16.7461 | 0 | [411, 755] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_172614__689.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3112 | Apple-MacBook-Pro-M1 | event_scheduler | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 5 | 20231226_235409__188 | 0 | 0.0 | 8.30775 | 0 | [411, 375] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231226_235409__188.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3113 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | InJulia | 1SHOT | true | false | 5 | 20231214_004235__581 | 0 | 0.0 | 15.322 | 0 | [124, 438] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__InJulia__1SHOT__20231214_004235__581.json | 25.0 | missing | missing | missing | |
| 3114 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_165806__662 | 1 | 0.0 | 16.2506 | 1 | [128, 516] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_165806__662.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3115 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_165820__520 | 0 | 0.0 | 13.6783 | 0 | [128, 433] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_165820__520.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3116 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231226_234439__145 | 1 | 0.0 | 12.605 | 1 | [128, 398] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__InJulia__1SHOT__20231226_234439__145.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3117 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_004219__757 | 1 | 0.0 | 12.47 | 1 | [153, 345] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231214_004219__757.json | 60.0 | missing | missing | missing | |
| 3118 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_165734__479 | 0 | 0.0 | 10.4342 | 0 | [169, 318] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_165734__479.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3119 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_165749__342 | 1 | 0.0 | 15.0129 | 1 | [169, 464] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_165749__342.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3120 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_234426__466 | 1 | 0.0 | 10.9511 | 1 | [169, 334] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231226_234426__466.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3121 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_004207__699 | 0 | 0.0 | 14.2614 | 0 | [300, 338] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231214_004207__699.json | 25.0 | missing | missing | missing | |
| 3122 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_165705__187 | 0 | 0.0 | 20.4789 | 0 | [316, 442] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_165705__187.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3123 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_165724__419 | 0 | 0.0 | 18.3347 | 0 | [316, 542] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_165724__419.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3124 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_234415__188 | 0 | 0.0 | 22.9606 | 0 | [316, 522] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231226_234415__188.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3125 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_004340__484 | 3 | 0.0 | 21.0981 | 5 | [11, 561] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231214_004340__484.json | 90.0 | missing | missing | missing | |
| 3126 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_165925__835 | 0 | 0.0 | 11.342 | 0 | [436, 300] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_165925__835.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3127 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_165953__830 | 0 | 0.0 | 28.1929 | 1 | [436, 807] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_165953__830.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3128 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_234508__422 | 1 | 0.0 | 16.5019 | 1 | [436, 457] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231226_234508__422.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3129 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_004318__627 | 0 | 0.0 | 22.4185 | 0 | [424, 497] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231214_004318__627.json | 0.0 | missing | missing | missing | |
| 3130 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_165905__595 | 0 | 0.0 | 22.6802 | 0 | [434, 645] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_165905__595.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3131 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_165913__771 | 0 | 0.0 | 8.44353 | 1 | [434, 208] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_165913__771.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3132 | Apple-MacBook-Pro-M1 | event_scheduler | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_234452__747 | 1 | 0.0 | 12.5569 | 1 | [434, 336] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231226_234452__747.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3133 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231214_004429__412 | 1 | 0.0 | 19.904 | 1 | [124, 564] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__InJulia__1SHOT__20231214_004429__412.json | 60.0 | missing | missing | missing | |
| 3134 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231225_170458__552 | 0 | 0.0 | 79.645 | 0 | [123, 594] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_170458__552.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3135 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231225_170601__548 | 0 | 0.0 | 63.3382 | 0 | [123, 468] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_170601__548.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3136 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231226_234907__469 | 0 | 0.0 | 100.992 | 0 | [123, 748] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__InJulia__1SHOT__20231226_234907__469.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3137 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_004409__679 | 1 | 0.0 | 9.86623 | 1 | [153, 269] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231214_004409__679.json | 60.0 | missing | missing | missing | |
| 3138 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_170243__649 | 1 | 0.0 | 41.6704 | 1 | [162, 296] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_170243__649.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3139 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_170338__439 | 0 | 0.0 | 53.6046 | 0 | [162, 387] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_170338__439.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3140 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_234726__465 | 0 | 0.0 | 40.1088 | 0 | [162, 283] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231226_234726__465.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3141 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_004358__969 | 0 | 0.0 | 18.7159 | 0 | [300, 457] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231214_004358__969.json | 25.0 | missing | missing | missing | |
| 3142 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_170100__818 | 1 | 0.0 | 66.6179 | 1 | [313, 280] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_170100__818.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3143 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_170202__346 | 0 | 0.0 | 61.0314 | 0 | [313, 415] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_170202__346.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3144 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_234646__682 | 0 | 0.0 | 97.3872 | 0 | [313, 516] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231226_234646__682.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3145 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_004536__187 | 1 | 0.0 | 19.6626 | 1 | [11, 525] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231214_004536__187.json | 60.0 | missing | missing | missing | |
| 3146 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_171229__908 | 0 | 0.0 | 85.7912 | 0 | [436, 568] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_171229__908.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3147 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_171238__680 | 0 | 0.0 | 9.69164 | 0 | [436, 4] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_171238__680.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3148 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_235029__204 | 0 | 0.0 | 16.7411 | 0 | [436, 58] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231226_235029__204.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3149 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_004516__993 | 0 | 0.0 | 26.3958 | 0 | [424, 597] | 0.4.0 | 5 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231214_004516__993.json | 0.0 | missing | missing | missing | |
| 3150 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_170951__786 | 0 | 0.0 | 102.185 | 0 | [434, 684] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_170951__786.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3151 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_171103__565 | 0 | 0.0 | 71.4798 | 0 | [434, 465] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_171103__565.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3152 | Apple-MacBook-Pro-M1 | event_scheduler | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_235012__991 | 0 | 0.0 | 63.985 | 0 | [434, 410] | 0.6.0 | 5 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/event_scheduler/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231226_235012__991.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3153 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231214_010913__674 | 0 | 0.0 | 67.7251 | 0 | [89, 548] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231214_010913__674.json | 50.0 | missing | missing | missing | |
| 3154 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_142511__732 | 0 | 0.0 | 23.315 | 1 | [97, 423] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_142511__732.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 3155 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | InJulia | 1SHOT | false | false | 5 | 20231225_142517__950 | 0 | 0.0 | 6.3899 | 0 | [97, 104] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_142517__950.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3156 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231227_002344__284 | 2 | 0.0 | 5.80949 | 3 | [97, 92] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231227_002344__284.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3157 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_010805__240 | 0 | 0.0 | 47.798 | 0 | [118, 377] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231214_010805__240.json | 25.0 | missing | missing | missing | |
| 3158 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_142442__217 | 0 | 0.0 | 14.8807 | 0 | [135, 260] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_142442__217.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3159 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_142447__656 | 0 | 0.0 | 5.02571 | 0 | [135, 72] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_142447__656.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3160 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_002339__118 | 3 | 0.0 | 6.11357 | 3 | [135, 93] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231227_002339__118.json | 83.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3161 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_010717__587 | 0 | 0.0 | 65.1538 | 0 | [211, 509] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231214_010717__587.json | 50.0 | missing | missing | missing | |
| 3162 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_142413__167 | 0 | 0.0 | 25.2016 | 3 | [229, 253] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_142413__167.json | 68.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3163 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_142427__660 | 1 | 0.0 | 14.5692 | 1 | [229, 236] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_142427__660.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 3164 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_002332__865 | 0 | 0.0 | 24.8174 | 3 | [229, 252] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231227_002332__865.json | 68.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3165 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_011155__641 | 0 | 0.0 | 60.4912 | 0 | [11, 467] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231214_011155__641.json | 50.0 | missing | missing | missing | |
| 3166 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_142631__769 | 2 | 0.0 | 16.5486 | 2 | [400, 242] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_142631__769.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 3167 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_142649__487 | 2 | 0.0 | 18.0384 | 3 | [400, 269] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_142649__487.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3168 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_002426__776 | 0 | 0.0 | 16.97 | 0 | [400, 245] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231227_002426__776.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3169 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_011055__712 | 0 | 0.0 | 64.1534 | 0 | [389, 404] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231214_011055__712.json | 0.0 | missing | missing | missing | |
| 3170 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_142558__838 | 1 | 0.0 | 22.2302 | 2 | [397, 344] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_142558__838.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 3171 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_142614__308 | 0 | 0.0 | 16.5594 | 0 | [397, 243] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_142614__308.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3172 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_002408__254 | 0 | 0.0 | 23.1266 | 0 | [397, 358] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231227_002408__254.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3173 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | InJulia | 1SHOT | true | true | 5 | 20231214_011400__819 | 0 | 0.0 | 53.6548 | 0 | [89, 451] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__InJulia__1SHOT__20231214_011400__819.json | 50.0 | missing | missing | missing | |
| 3174 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_142804__566 | 0 | 0.0 | 1.56477 | 0 | [71, 15] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_142804__566.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3175 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | InJulia | 1SHOT | true | true | 5 | 20231225_142813__391 | 0 | 0.0 | 8.78628 | 0 | [71, 155] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_142813__391.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3176 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_011307__959 | 0 | 0.0 | 20.157 | 0 | [118, 167] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231214_011307__959.json | 0.0 | missing | missing | missing | |
| 3177 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_142740__622 | 0 | 0.0 | 27.433 | 0 | [72, 504] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_142740__622.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3178 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_142802__158 | 0 | 0.0 | 21.6767 | 0 | [72, 399] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_142802__158.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3179 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_011246__430 | 0 | 0.0 | 51.0405 | 0 | [211, 406] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231214_011246__430.json | 50.0 | missing | missing | missing | |
| 3180 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_142705__962 | 0 | 0.0 | 15.5559 | 0 | [104, 88] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_142705__962.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3181 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_142713__194 | 0 | 0.0 | 7.95893 | 0 | [104, 134] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_142713__194.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3182 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_072324__712 | 0 | 0.0 | 10.3802 | 0 | [11, 61] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231214_072324__712.json | 0.0 | missing | missing | missing | |
| 3183 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_142911__223 | 0 | 0.0 | 12.4744 | 0 | [89, 225] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_142911__223.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3184 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_142912__463 | 0 | 0.0 | 1.37346 | 0 | [89, 11] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_142912__463.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3185 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_072313__145 | 0 | 0.0 | 109.114 | 0 | [389, 538] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231214_072313__145.json | 0.0 | missing | missing | missing | |
| 3186 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_142841__175 | 0 | 0.0 | 9.80212 | 0 | [86, 174] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_142841__175.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3187 | Apple-MacBook-Pro-M1 | extract_julia_code | codellama:13b-python | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_142858__692 | 0 | 0.0 | 16.6349 | 0 | [86, 304] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_142858__692.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3188 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_234655__125 | 0 | 0.0 | 10.6874 | 0 | [1, 334] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_234655__125.json | 25.0 | missing | missing | missing | |
| 3189 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_234708__156 | 0 | 0.0 | 12.0575 | 0 | [1, 374] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231219_234708__156.json | 25.0 | missing | missing | missing | |
| 3190 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_144900__231 | 2 | 0.0 | 45.4329 | 2 | [90, 273] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_144900__231.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 3191 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_144932__862 | 2 | 0.0 | 30.7255 | 2 | [90, 177] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_144932__862.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 3192 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_003537__647 | 0 | 0.0 | 43.382 | 3 | [90, 257] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_003537__647.json | 68.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3193 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_234623__114 | 0 | 0.0 | 2.24022 | 0 | [1, 72] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_234623__114.json | 25.0 | missing | missing | missing | |
| 3194 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_234632__484 | 0 | 0.0 | 8.7994 | 0 | [1, 274] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_234632__484.json | 25.0 | missing | missing | missing | |
| 3195 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_144724__475 | 2 | 0.0 | 71.5143 | 2 | [131, 424] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_144724__475.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 3196 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_144815__751 | 0 | 0.0 | 49.8838 | 0 | [131, 291] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_144815__751.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3197 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_003453__641 | 0 | 0.0 | 140.334 | 0 | [131, 810] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_003453__641.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3198 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_234607__696 | 0 | 0.0 | 4.35872 | 0 | [1, 134] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_234607__696.json | 0.0 | missing | missing | missing | |
| 3199 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_234613__536 | 0 | 0.0 | 5.98466 | 0 | [1, 183] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_234613__536.json | 0.0 | missing | missing | missing | |
| 3200 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_144536__795 | 0 | 0.0 | 57.0753 | 4 | [224, 161] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_144536__795.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3201 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_144612__855 | 1 | 0.0 | 36.0845 | 3 | [224, 193] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_144612__855.json | 73.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3202 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_003233__648 | 0 | 0.0 | 44.8647 | 0 | [224, 97] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_003233__648.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3203 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_234852__274 | 0 | 0.0 | 20.4252 | 0 | [1, 557] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_234852__274.json | 0.0 | missing | missing | missing | |
| 3204 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_234914__491 | 0 | 0.0 | 22.0867 | 0 | [1, 598] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_234914__491.json | 25.0 | missing | missing | missing | |
| 3205 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_145305__691 | 1 | 0.0 | 74.0036 | 4 | [419, 385] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_145305__691.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3206 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_145349__154 | 0 | 0.0 | 43.5632 | 1 | [419, 202] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_145349__154.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 3207 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_003731__364 | 2 | 0.0 | 65.6117 | 4 | [419, 329] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_003731__364.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3208 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_234817__538 | 0 | 0.0 | 1.22261 | 0 | [1, 36] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_234817__538.json | 0.0 | missing | missing | missing | |
| 3209 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_234829__105 | 0 | 0.0 | 11.9802 | 0 | [1, 338] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_234829__105.json | 25.0 | missing | missing | missing | |
| 3210 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_145129__673 | 5 | 0.0 | 33.2935 | 4 | [417, 140] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_145129__673.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3211 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_145150__892 | 2 | 0.0 | 20.8068 | 3 | [417, 63] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_145150__892.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3212 | Apple-MacBook-Pro-M1 | extract_julia_code | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_003625__313 | 2 | 0.0 | 47.3912 | 4 | [417, 225] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_003625__313.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3213 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_004318__971 | 0 | 0.0 | 12.9747 | 0 | [89, 494] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_004318__971.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3214 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_114514__282 | 0 | 0.0 | 10.1774 | 0 | [89, 390] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_114514__282.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3215 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_114528__209 | 0 | 0.0 | 14.3481 | 0 | [89, 544] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_114528__209.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3216 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_114541__757 | 0 | 0.0 | 12.6967 | 0 | [89, 484] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_114541__757.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3217 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_004305__873 | 0 | 0.0 | 3.26013 | 0 | [126, 117] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_004305__873.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3218 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_114450__993 | 0 | 0.0 | 5.01629 | 0 | [126, 186] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_114450__993.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3219 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_114456__508 | 0 | 0.0 | 5.93223 | 0 | [126, 222] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_114456__508.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3220 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_114503__288 | 0 | 0.0 | 7.12582 | 0 | [126, 268] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_114503__288.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3221 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_004302__549 | 0 | 0.0 | 5.46064 | 0 | [217, 58] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_004302__549.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3222 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_114437__146 | 0 | 0.0 | 7.838 | 0 | [217, 146] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_114437__146.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3223 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_114442__482 | 0 | 0.0 | 5.42468 | 0 | [217, 187] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_114442__482.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3224 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_114445__882 | 0 | 0.0 | 3.19949 | 0 | [217, 101] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_114445__882.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3225 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_004331__116 | 0 | 0.0 | 7.03182 | 0 | [378, 219] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_004331__116.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3226 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_114621__290 | 0 | 0.0 | 8.397 | 0 | [378, 269] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_114621__290.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3227 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_114628__936 | 0 | 0.0 | 6.57376 | 0 | [378, 202] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_114628__936.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3228 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_114638__374 | 0 | 0.0 | 9.70475 | 0 | [378, 316] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_114638__374.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3229 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_004324__750 | 0 | 0.0 | 5.65424 | 0 | [375, 168] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_004324__750.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3230 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_114551__418 | 0 | 0.0 | 9.9777 | 0 | [375, 326] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_114551__418.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3231 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_114600__157 | 0 | 0.0 | 9.19691 | 0 | [375, 298] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_114600__157.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3232 | Apple-MacBook-Pro-M1 | extract_julia_code | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_114613__598 | 0 | 0.0 | 12.9481 | 0 | [375, 432] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_114613__598.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3233 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | InJulia | 1SHOT | true | false | 5 | 20231214_005750__948 | 0 | 0.0 | 16.2995 | 0 | [89, 476] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__InJulia__1SHOT__20231214_005750__948.json | 25.0 | missing | missing | missing | |
| 3234 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | InJulia | 1SHOT | true | false | 5 | 20231225_140722__171 | 0 | 0.0 | 18.3163 | 0 | [89, 536] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__InJulia__1SHOT__20231225_140722__171.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3235 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | InJulia | 1SHOT | true | false | 5 | 20231225_140735__895 | 0 | 0.0 | 12.7006 | 0 | [1, 393] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__InJulia__1SHOT__20231225_140735__895.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3236 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | InJulia | 1SHOT | true | false | 5 | 20231227_001712__113 | 0 | 0.0 | 20.6041 | 0 | [89, 605] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__InJulia__1SHOT__20231227_001712__113.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3237 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_005733__282 | 0 | 0.0 | 3.28608 | 0 | [118, 78] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaExpertAsk__1SHOT__20231214_005733__282.json | 50.0 | missing | missing | missing | |
| 3238 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_140659__610 | 0 | 0.0 | 10.5639 | 0 | [118, 303] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_140659__610.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3239 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_140703__499 | 0 | 0.0 | 3.98842 | 0 | [1, 127] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_140703__499.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3240 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_001651__327 | 0 | 0.0 | 10.0247 | 0 | [118, 290] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaExpertAsk__1SHOT__20231227_001651__327.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3241 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_005730__644 | 0 | 0.0 | 12.9024 | 0 | [211, 336] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_005730__644.json | 25.0 | missing | missing | missing | |
| 3242 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_140639__126 | 1 | 0.0 | 28.2002 | 1 | [229, 611] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_140639__126.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 3243 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_140649__602 | 0 | 0.0 | 9.80533 | 0 | [1, 294] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_140649__602.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3244 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_001641__340 | 0 | 0.0 | 14.7387 | 0 | [229, 250] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_001641__340.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3245 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_005836__474 | 0 | 0.0 | 23.1775 | 0 | [11, 616] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_005836__474.json | 25.0 | missing | missing | missing | |
| 3246 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_140847__663 | 0 | 0.0 | 15.2945 | 0 | [11, 420] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_140847__663.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3247 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_140849__103 | 0 | 0.0 | 1.97269 | 0 | [1, 57] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_140849__103.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3248 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_001746__971 | 0 | 0.0 | 21.914 | 0 | [11, 595] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_001746__971.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3249 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_005813__246 | 0 | 0.0 | 13.1495 | 0 | [389, 270] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaRecapTask__1SHOT__20231214_005813__246.json | 0.0 | missing | missing | missing | |
| 3250 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_140802__171 | 0 | 0.0 | 4.85101 | 0 | [389, 36] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_140802__171.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3251 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_140831__179 | 0 | 0.0 | 29.3798 | 0 | [1, 776] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_140831__179.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3252 | Apple-MacBook-Pro-M1 | extract_julia_code | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_001724__134 | 0 | 0.0 | 12.2519 | 0 | [389, 250] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/llama2/evaluation__JuliaRecapTask__1SHOT__20231227_001724__134.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3253 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | InJulia | 1SHOT | true | true | 5 | 20231214_072654__478 | 0 | 0.0 | 48.6911 | 0 | [89, 373] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__InJulia__1SHOT__20231214_072654__478.json | 50.0 | missing | missing | missing | |
| 3254 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_143003__366 | 2 | 0.0 | 9.81618 | 2 | [89, 323] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__InJulia__1SHOT__20231225_143003__366.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 3255 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_143012__939 | 2 | 0.0 | 9.08471 | 4 | [89, 297] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__InJulia__1SHOT__20231225_143012__939.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3256 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | InJulia | 1SHOT | true | true | 5 | 20231227_002458__617 | 2 | 0.0 | 8.70972 | 4 | [89, 283] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__InJulia__1SHOT__20231227_002458__617.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3257 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_072605__949 | 0 | 0.0 | 55.1543 | 0 | [118, 420] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231214_072605__949.json | 0.0 | missing | missing | missing | |
| 3258 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_142942__712 | 0 | 0.0 | 7.41092 | 0 | [128, 236] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_142942__712.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3259 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_142953__702 | 0 | 0.0 | 10.3598 | 0 | [128, 334] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_142953__702.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3260 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_002450__715 | 0 | 0.0 | 9.90197 | 0 | [128, 317] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231227_002450__715.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3261 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_072510__551 | 1 | 0.0 | 106.253 | 2 | [211, 622] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231214_072510__551.json | 67.5 | missing | missing | missing | |
| 3262 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_142928__154 | 0 | 0.0 | 16.0749 | 0 | [221, 308] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_142928__154.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3263 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_142935__874 | 0 | 0.0 | 6.70834 | 0 | [221, 194] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_142935__874.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3264 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_002440__348 | 0 | 0.0 | 13.5333 | 0 | [221, 229] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231227_002440__348.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3265 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_072804__461 | 0 | 0.0 | 15.915 | 0 | [11, 435] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231214_072804__461.json | 50.0 | missing | missing | missing | |
| 3266 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_143103__727 | 1 | 0.0 | 5.50103 | 1 | [392, 124] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_143103__727.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 3267 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_143118__282 | 2 | 0.0 | 14.4601 | 3 | [392, 410] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_143118__282.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3268 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_002530__770 | 0 | 0.0 | 13.837 | 0 | [392, 388] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231227_002530__770.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3269 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_072748__801 | 0 | 0.0 | 30.4187 | 0 | [389, 712] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaRecapTask__1SHOT__20231214_072748__801.json | 50.0 | missing | missing | missing | |
| 3270 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_143042__705 | 2 | 0.0 | 12.5475 | 2 | [389, 351] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_143042__705.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 3271 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_143057__177 | 0 | 0.0 | 15.3 | 0 | [389, 436] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_143057__177.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3272 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_002516__849 | 2 | 0.0 | 16.8136 | 4 | [389, 480] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder/evaluation__JuliaRecapTask__1SHOT__20231227_002516__849.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3273 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_180952__507 | 0 | 0.0 | 17.411 | 0 | [89, 337] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_180952__507.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3274 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_180957__736 | 0 | 0.0 | 5.67614 | 0 | [89, 104] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_180957__736.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3275 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_181020__889 | 2 | 0.0 | 23.106 | 3 | [89, 448] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_181020__889.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3276 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_180855__299 | 0 | 0.0 | 11.6991 | 0 | [128, 220] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_180855__299.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3277 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_180916__174 | 0 | 0.0 | 20.7913 | 0 | [128, 398] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_180916__174.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3278 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_180934__742 | 0 | 0.0 | 18.2323 | 0 | [128, 349] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_180934__742.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3279 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_180814__324 | 0 | 0.0 | 11.9533 | 0 | [221, 213] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_180814__324.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3280 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_180830__508 | 0 | 0.0 | 15.6698 | 2 | [221, 286] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_180830__508.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 3281 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_180843__625 | 0 | 0.0 | 12.8595 | 0 | [221, 231] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_180843__625.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3282 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_181138__945 | 0 | 0.0 | 15.7593 | 0 | [392, 264] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_181138__945.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3283 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_181156__689 | 5 | 0.0 | 17.5477 | 3 | [392, 298] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_181156__689.json | 93.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3284 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_181207__472 | 1 | 0.0 | 10.723 | 2 | [392, 167] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_181207__472.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 3285 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_181040__752 | 2 | 0.0 | 19.6361 | 3 | [389, 338] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_181040__752.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3286 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_181057__827 | 0 | 0.0 | 16.9697 | 0 | [389, 287] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_181057__827.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3287 | Apple-MacBook-Pro-M1 | extract_julia_code | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_181122__285 | 4 | 0.0 | 24.2607 | 3 | [389, 425] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_181122__285.json | 88.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3288 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231220_073949__970 | 0 | 0.0 | 15.8077 | 0 | [1, 482] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231220_073949__970.json | 25.0 | missing | missing | missing | |
| 3289 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231220_074000__680 | 0 | 0.0 | 10.2106 | 0 | [1, 320] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231220_074000__680.json | 25.0 | missing | missing | missing | |
| 3290 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231225_145800__767 | 0 | 0.0 | 7.36794 | 0 | [88, 181] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_145800__767.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3291 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231225_145808__476 | 0 | 0.0 | 8.10971 | 0 | [88, 200] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_145808__476.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3292 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231227_003945__720 | 0 | 0.0 | 6.87629 | 0 | [88, 167] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_003945__720.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3293 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_235349__824 | 0 | 0.0 | 12.2304 | 0 | [118, 351] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_235349__824.json | 25.0 | missing | missing | missing | |
| 3294 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231220_073915__951 | 0 | 0.0 | 10.6741 | 0 | [135, 299] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231220_073915__951.json | 25.0 | missing | missing | missing | |
| 3295 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_145745__910 | 0 | 0.0 | 7.63909 | 0 | [129, 179] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_145745__910.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3296 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_145753__442 | 0 | 0.0 | 7.44825 | 0 | [129, 174] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_145753__442.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3297 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_003938__574 | 0 | 0.0 | 12.2066 | 0 | [129, 295] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_003938__574.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3298 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_235325__758 | 0 | 0.0 | 5.66995 | 0 | [1, 174] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_235325__758.json | 25.0 | missing | missing | missing | |
| 3299 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_235337__183 | 0 | 0.0 | 12.2291 | 0 | [1, 363] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_235337__183.json | 25.0 | missing | missing | missing | |
| 3300 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_145732__575 | 0 | 0.0 | 11.3329 | 0 | [223, 119] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_145732__575.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3301 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_145738__703 | 0 | 0.0 | 6.17309 | 0 | [223, 130] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_145738__703.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3302 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_003925__220 | 1 | 0.0 | 21.1589 | 1 | [223, 374] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_003925__220.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 3303 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_145911__482 | 2 | 0.0 | 14.9973 | 2 | [396, 325] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_145911__482.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 3304 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_145928__621 | 0 | 0.0 | 16.9559 | 3 | [396, 373] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_145928__621.json | 68.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3305 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_003959__637 | 0 | 0.0 | 6.15009 | 0 | [396, 102] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_003959__637.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3306 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_145837__693 | 0 | 0.0 | 9.08408 | 0 | [394, 177] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_145837__693.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3307 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_145856__641 | 0 | 0.0 | 18.6276 | 0 | [394, 414] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_145856__641.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3308 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_003953__874 | 0 | 0.0 | 8.20422 | 0 | [394, 154] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_003953__874.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3309 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | false | false | 5 | 20231227_234122__825 | 0 | 0.0 | 20.1359 | 0 | [87, 635] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_234122__825.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3310 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_234133__904 | 2 | 0.0 | 10.883 | 4 | [87, 343] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_234133__904.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3311 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_234141__167 | 0 | 0.0 | 8.16404 | 0 | [87, 255] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_234141__167.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3312 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_234151__399 | 0 | 0.0 | 9.45179 | 0 | [87, 297] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_234151__399.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3313 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231227_234201__726 | 1 | 0.0 | 9.47905 | 4 | [87, 298] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_234201__726.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3314 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_234038__996 | 0 | 0.0 | 3.15459 | 0 | [128, 85] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_234038__996.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3315 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_234042__148 | 0 | 0.0 | 4.38565 | 0 | [128, 126] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_234042__148.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3316 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_234050__630 | 0 | 0.0 | 7.93253 | 0 | [128, 242] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_234050__630.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3317 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_234055__922 | 0 | 0.0 | 4.985 | 0 | [128, 146] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_234055__922.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3318 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_234102__801 | 0 | 0.0 | 6.53412 | 0 | [128, 196] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_234102__801.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3319 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_233948__525 | 0 | 0.0 | 13.1358 | 0 | [222, 365] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_233948__525.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3320 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_234004__843 | 0 | 0.0 | 16.6248 | 0 | [222, 498] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_234004__843.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3321 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_234015__353 | 0 | 0.0 | 10.5963 | 0 | [222, 310] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_234015__353.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3322 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_234025__467 | 2 | 0.0 | 10.1018 | 4 | [222, 294] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_234025__467.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3323 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_234035__255 | 0 | 0.0 | 9.55406 | 0 | [222, 277] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_234035__255.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3324 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_234323__920 | 0 | 0.0 | 10.6055 | 0 | [395, 277] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_234323__920.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3325 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_234336__768 | 0 | 0.0 | 12.9084 | 0 | [395, 348] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_234336__768.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3326 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_234357__971 | 0 | 0.0 | 21.0842 | 0 | [395, 594] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_234357__971.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3327 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_234414__438 | 0 | 0.0 | 17.0635 | 0 | [395, 474] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_234414__438.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3328 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_234427__705 | 0 | 0.0 | 12.434 | 0 | [395, 333] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_234427__705.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3329 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_234213__156 | 0 | 0.0 | 12.0336 | 0 | [393, 321] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_234213__156.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3330 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_234226__129 | 0 | 0.0 | 12.6612 | 0 | [393, 340] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_234226__129.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3331 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_234245__182 | 0 | 0.0 | 19.0305 | 0 | [393, 533] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_234245__182.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3332 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_234301__282 | 0 | 0.0 | 15.3566 | 0 | [393, 422] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_234301__282.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3333 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_234312__552 | 0 | 0.0 | 11.0397 | 0 | [393, 290] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_234312__552.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3334 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231227_234623__545 | 0 | 0.0 | 10.3962 | 0 | [87, 257] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_234623__545.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3335 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_234631__556 | 0 | 0.0 | 8.71872 | 0 | [87, 214] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_234631__556.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3336 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_234644__542 | 0 | 0.0 | 12.1947 | 0 | [87, 303] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_234644__542.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3337 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_234649__528 | 0 | 0.0 | 4.90084 | 0 | [87, 115] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_234649__528.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3338 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_234658__186 | 0 | 0.0 | 9.4983 | 1 | [87, 234] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_234658__186.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 3339 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_234530__786 | 0 | 0.0 | 5.98935 | 0 | [128, 138] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_234530__786.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3340 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_234540__807 | 0 | 0.0 | 10.2836 | 2 | [128, 249] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_234540__807.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 3341 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_234554__508 | 0 | 0.0 | 13.6842 | 4 | [128, 335] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_234554__508.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3342 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_234600__926 | 0 | 0.0 | 4.79094 | 0 | [128, 107] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_234600__926.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3343 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_234612__545 | 0 | 0.0 | 12.4795 | 0 | [128, 305] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_234612__545.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3344 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_234442__126 | 0 | 0.0 | 14.9855 | 0 | [222, 331] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_234442__126.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3345 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_234455__403 | 0 | 0.0 | 12.9721 | 0 | [222, 302] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_234455__403.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3346 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_234506__992 | 2 | 0.0 | 10.7989 | 3 | [222, 247] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_234506__992.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3347 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_234516__279 | 0 | 0.0 | 9.73969 | 0 | [222, 220] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_234516__279.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3348 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_234524__183 | 0 | 0.0 | 8.31752 | 0 | [222, 184] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_234524__183.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3349 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_234843__788 | 0 | 0.0 | 17.9292 | 0 | [395, 393] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_234843__788.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3350 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_234857__729 | 0 | 0.0 | 13.6827 | 0 | [395, 289] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_234857__729.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3351 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_234909__682 | 0 | 0.0 | 12.5117 | 0 | [395, 260] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_234909__682.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3352 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_234924__245 | 0 | 0.0 | 14.9846 | 0 | [395, 321] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_234924__245.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3353 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_234937__723 | 0 | 0.0 | 12.381 | 0 | [395, 257] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_234937__723.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3354 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_234717__219 | 0 | 0.0 | 18.0223 | 0 | [393, 395] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_234717__219.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3355 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_234729__225 | 1 | 0.0 | 11.8805 | 4 | [393, 244] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_234729__225.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3356 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_234746__691 | 2 | 0.0 | 16.9101 | 2 | [393, 368] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_234746__691.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 3357 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_234803__578 | 0 | 0.0 | 17.391 | 0 | [393, 380] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_234803__578.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3358 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_234825__500 | 2 | 0.0 | 21.3066 | 3 | [393, 475] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_234825__500.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3359 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_122042__263 | 0 | 0.0 | 13.447 | 0 | [87, 245] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_122042__263.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3360 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_122055__584 | 0 | 0.0 | 12.8496 | 0 | [87, 234] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_122055__584.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3361 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_004215__438 | 0 | 0.0 | 10.3957 | 0 | [87, 187] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231227_004215__438.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3362 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_122008__781 | 0 | 0.0 | 6.7766 | 0 | [128, 115] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_122008__781.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3363 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_122028__928 | 0 | 0.0 | 20.7328 | 0 | [128, 377] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_122028__928.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3364 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_004205__462 | 0 | 0.0 | 8.8331 | 0 | [128, 154] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_004205__462.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3365 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231226_121945__298 | 0 | 0.0 | 5.79401 | 0 | [222, 86] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_121945__298.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3366 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_122001__677 | 0 | 0.0 | 15.6676 | 0 | [222, 271] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_122001__677.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3367 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_004156__687 | 0 | 0.0 | 30.2944 | 0 | [222, 375] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_004156__687.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3368 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_122255__284 | 0 | 0.0 | 24.0008 | 0 | [395, 400] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_122255__284.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3369 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_122318__589 | 0 | 0.0 | 22.8435 | 2 | [395, 379] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_122318__589.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 3370 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_004256__336 | 0 | 0.0 | 25.3983 | 0 | [395, 424] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_004256__336.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3371 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_122152__868 | 0 | 0.0 | 31.5719 | 0 | [393, 536] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_122152__868.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3372 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_122231__148 | 1 | 0.0 | 38.8991 | 3 | [393, 666] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_122231__148.json | 73.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3373 | Apple-MacBook-Pro-M1 | extract_julia_code | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_004231__709 | 0 | 0.0 | 15.6813 | 0 | [393, 248] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_004231__709.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3374 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_115432__612 | 0 | 0.0 | 133.053 | 0 | [88, 785] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_115432__612.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3375 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_115550__741 | 0 | 0.0 | 77.9773 | 4 | [88, 458] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_115550__741.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3376 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_115639__515 | 0 | 0.0 | 48.5566 | 4 | [88, 283] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_115639__515.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3377 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_151747__297 | 0 | 0.0 | 65.2108 | 0 | [88, 384] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_151747__297.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3378 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_151843__276 | 2 | 0.0 | 55.8891 | 2 | [88, 328] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_151843__276.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 3379 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_115025__950 | 0 | 0.0 | 34.5608 | 0 | [127, 194] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_115025__950.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3380 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_115109__234 | 0 | 0.0 | 43.4343 | 0 | [127, 248] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_115109__234.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3381 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_115219__441 | 0 | 0.0 | 69.8194 | 0 | [127, 407] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_115219__441.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3382 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_151532__170 | 0 | 0.0 | 44.4433 | 0 | [127, 253] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_151532__170.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3383 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_151642__112 | 0 | 0.0 | 69.9384 | 0 | [127, 406] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_151642__112.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3384 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_114746__887 | 2 | 0.0 | 68.3729 | 3 | [220, 355] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_114746__887.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3385 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_114844__262 | 2 | 0.0 | 57.2817 | 1 | [220, 315] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_114844__262.json | 66.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 3386 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_114951__456 | 0 | 0.0 | 66.6715 | 0 | [220, 371] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_114951__456.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3387 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_151351__734 | 0 | 0.0 | 38.4833 | 0 | [220, 201] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_151351__734.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3388 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_151447__311 | 2 | 0.0 | 56.4804 | 3 | [220, 309] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_151447__311.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3389 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_120159__737 | 2 | 0.0 | 77.1404 | 4 | [401, 396] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_120159__737.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3390 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_120354__643 | 0 | 0.0 | 114.77 | 0 | [401, 611] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_120354__643.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3391 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_120624__767 | 0 | 0.0 | 149.229 | 0 | [401, 784] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_120624__767.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3392 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_152141__601 | 0 | 0.0 | 42.4428 | 4 | [401, 191] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_152141__601.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3393 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_152325__387 | 0 | 0.0 | 102.984 | 0 | [401, 543] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_152325__387.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3394 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_115749__856 | 0 | 0.0 | 70.3888 | 0 | [399, 356] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_115749__856.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3395 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_115926__770 | 0 | 0.0 | 96.4095 | 0 | [399, 506] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_115926__770.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3396 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_120041__760 | 0 | 0.0 | 74.756 | 4 | [399, 377] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_120041__760.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3397 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_151945__894 | 0 | 0.0 | 61.1968 | 0 | [399, 301] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_151945__894.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3398 | Apple-MacBook-Pro-M1 | extract_julia_code | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_152059__108 | 0 | 0.0 | 73.8718 | 0 | [399, 374] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_152059__108.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3399 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_150030__863 | 0 | 0.0 | 9.31086 | 0 | [96, 231] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_150030__863.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3400 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_150036__828 | 2 | 0.0 | 5.83328 | 2 | [96, 140] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_150036__828.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 3401 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_004032__562 | 0 | 0.0 | 6.92805 | 0 | [96, 168] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231227_004032__562.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3402 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_150013__538 | 2 | 0.0 | 6.00908 | 2 | [137, 136] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_150013__538.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 3403 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_150021__170 | 0 | 0.0 | 7.36175 | 0 | [137, 171] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_150021__170.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3404 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_004025__651 | 0 | 0.0 | 3.34153 | 3 | [137, 66] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_004025__651.json | 68.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3405 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_145951__411 | 2 | 0.0 | 22.7207 | 3 | [231, 386] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_145951__411.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3406 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_150007__576 | 0 | 0.0 | 16.1204 | 0 | [231, 379] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_150007__576.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3407 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_004021__953 | 0 | 0.0 | 22.2407 | 0 | [231, 382] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_004021__953.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3408 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_150152__868 | 2 | 0.0 | 31.6362 | 3 | [404, 725] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_150152__868.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3409 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_150208__853 | 0 | 0.0 | 15.7094 | 0 | [404, 341] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_150208__853.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3410 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_004049__355 | 1 | 0.0 | 9.98944 | 3 | [404, 198] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_004049__355.json | 73.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3411 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_150109__871 | 2 | 0.0 | 13.9512 | 2 | [402, 298] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_150109__871.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 3412 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_150120__697 | 0 | 0.0 | 11.1489 | 0 | [402, 228] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_150120__697.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3413 | Apple-MacBook-Pro-M1 | extract_julia_code | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_004038__971 | 2 | 0.0 | 6.66467 | 3 | [402, 115] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_004038__971.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3414 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | InJulia | 1SHOT | false | false | 5 | 20231214_005933__225 | 0 | 0.0 | 14.7763 | 0 | [89, 433] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231214_005933__225.json | 0.0 | missing | missing | missing | |
| 3415 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_140931__362 | 1 | 0.0 | 5.40041 | 3 | [94, 168] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_140931__362.json | 73.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3416 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_140939__877 | 0 | 0.0 | 7.83858 | 0 | [94, 250] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_140939__877.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3417 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231227_001824__184 | 2 | 0.0 | 9.51902 | 4 | [94, 304] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231227_001824__184.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3418 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_005918__873 | 0 | 0.0 | 12.8675 | 0 | [118, 367] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231214_005918__873.json | 50.0 | missing | missing | missing | |
| 3419 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_140916__615 | 0 | 0.0 | 11.4063 | 0 | [135, 357] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_140916__615.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3420 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_140926__232 | 2 | 0.0 | 9.27027 | 4 | [135, 288] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_140926__232.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3421 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_001814__257 | 0 | 0.0 | 11.0747 | 0 | [135, 345] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231227_001814__257.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3422 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_005905__554 | 0 | 0.0 | 28.4755 | 2 | [211, 746] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231214_005905__554.json | 62.5 | missing | missing | missing | |
| 3423 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_140900__655 | 0 | 0.0 | 11.189 | 0 | [229, 163] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_140900__655.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3424 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_140905__197 | 2 | 0.0 | 4.73552 | 4 | [229, 121] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_140905__197.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3425 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_001803__284 | 0 | 0.0 | 17.1515 | 0 | [229, 363] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231227_001803__284.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3426 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_010031__683 | 0 | 0.0 | 23.9321 | 0 | [11, 639] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231214_010031__683.json | 50.0 | missing | missing | missing | |
| 3427 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_141031__221 | 0 | 0.0 | 9.82892 | 0 | [402, 259] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_141031__221.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3428 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_141044__475 | 0 | 0.0 | 12.5201 | 0 | [402, 344] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_141044__475.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3429 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_001839__342 | 0 | 0.0 | 7.86678 | 0 | [402, 195] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231227_001839__342.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3430 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_010007__876 | 0 | 0.0 | 15.3545 | 0 | [389, 327] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231214_010007__876.json | 0.0 | missing | missing | missing | |
| 3431 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_141013__403 | 0 | 0.0 | 16.2672 | 1 | [400, 460] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_141013__403.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 3432 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_141021__982 | 2 | 0.0 | 7.59333 | 4 | [400, 187] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_141021__982.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3433 | Apple-MacBook-Pro-M1 | extract_julia_code | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_001831__101 | 0 | 0.0 | 7.15872 | 0 | [400, 172] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231227_001831__101.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3434 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | InJulia | 1SHOT | true | false | 5 | 20231214_073019__506 | 0 | 0.0 | 13.5043 | 0 | [89, 397] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__InJulia__1SHOT__20231214_073019__506.json | 25.0 | missing | missing | missing | |
| 3435 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_143410__389 | 0 | 0.0 | 3.40421 | 0 | [92, 50] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__InJulia__1SHOT__20231225_143410__389.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3436 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_143413__588 | 0 | 0.0 | 2.51248 | 0 | [92, 33] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__InJulia__1SHOT__20231225_143413__588.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3437 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231227_002707__693 | 0 | 0.0 | 18.5531 | 0 | [92, 336] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__InJulia__1SHOT__20231227_002707__693.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3438 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_073006__637 | 0 | 0.0 | 6.61338 | 0 | [118, 182] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231214_073006__637.json | 50.0 | missing | missing | missing | |
| 3439 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_143356__391 | 0 | 0.0 | 16.9653 | 0 | [131, 298] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_143356__391.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3440 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_143407__659 | 0 | 0.0 | 10.5739 | 0 | [131, 178] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_143407__659.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3441 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_002649__798 | 0 | 0.0 | 13.0209 | 0 | [131, 223] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231227_002649__798.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3442 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_072959__333 | 0 | 0.0 | 19.7094 | 0 | [211, 522] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231214_072959__333.json | 50.0 | missing | missing | missing | |
| 3443 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_143331__938 | 0 | 0.0 | 26.7265 | 0 | [224, 292] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_143331__938.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3444 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_143339__302 | 0 | 0.0 | 7.84411 | 0 | [224, 115] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_143339__302.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3445 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_002636__147 | 0 | 0.0 | 22.7331 | 0 | [224, 225] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231227_002636__147.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3446 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_073128__267 | 0 | 0.0 | 21.1597 | 0 | [11, 568] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231214_073128__267.json | 50.0 | missing | missing | missing | |
| 3447 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_143640__346 | 0 | 0.0 | 8.02687 | 0 | [395, 87] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_143640__346.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3448 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_143724__334 | 0 | 0.0 | 44.3876 | 0 | [395, 721] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_143724__334.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3449 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_002740__972 | 0 | 0.0 | 8.04379 | 0 | [395, 87] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231227_002740__972.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3450 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_073107__586 | 0 | 0.0 | 29.9543 | 0 | [389, 700] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231214_073107__586.json | 0.0 | missing | missing | missing | |
| 3451 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_143552__987 | 0 | 0.0 | 63.4508 | 0 | [392, 1029] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_143552__987.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3452 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_143632__215 | 0 | 0.0 | 39.7004 | 0 | [392, 643] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_143632__215.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3453 | Apple-MacBook-Pro-M1 | extract_julia_code | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_002732__605 | 0 | 0.0 | 25.0342 | 0 | [392, 390] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231227_002732__605.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3454 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_150239__427 | 0 | 0.0 | 1.4493 | 0 | [83, 49] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_150239__427.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3455 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_150308__656 | 0 | 0.0 | 28.9237 | 0 | [83, 1058] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_150308__656.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3456 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_004118__760 | 0 | 0.0 | 21.3184 | 0 | [83, 793] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231227_004118__760.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3457 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_150234__967 | 0 | 0.0 | 17.3613 | 0 | [120, 651] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_150234__967.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3458 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_150237__409 | 0 | 0.0 | 3.14171 | 0 | [120, 113] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_150237__409.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3459 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_004057__396 | 0 | 0.0 | 3.34953 | 0 | [120, 121] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_004057__396.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3460 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_150212__122 | 0 | 0.0 | 4.73362 | 0 | [211, 16] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_150212__122.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3461 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_150217__454 | 0 | 0.0 | 4.18541 | 0 | [211, 140] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_150217__454.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3462 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_004053__297 | 0 | 0.0 | 4.66874 | 0 | [211, 15] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_004053__297.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3463 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_150427__597 | 0 | 0.0 | 4.62901 | 0 | [372, 131] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_150427__597.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3464 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_150445__534 | 0 | 0.0 | 17.3485 | 0 | [372, 589] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_150445__534.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3465 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_004126__601 | 0 | 0.0 | 5.37821 | 0 | [372, 158] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_004126__601.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3466 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_150406__690 | 0 | 0.0 | 7.04964 | 0 | [369, 222] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_150406__690.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3467 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_150423__570 | 0 | 0.0 | 16.191 | 0 | [369, 549] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_150423__570.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3468 | Apple-MacBook-Pro-M1 | extract_julia_code | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_004120__555 | 0 | 0.0 | 2.07208 | 0 | [369, 33] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_004120__555.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3469 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231214_073154__463 | 0 | 0.0 | 10.8179 | 0 | [89, 319] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231214_073154__463.json | 50.0 | missing | missing | missing | |
| 3470 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_144014__991 | 2 | 0.0 | 33.2064 | 3 | [100, 251] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_144014__991.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3471 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_144048__259 | 2 | 0.0 | 33.3743 | 3 | [100, 252] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_144048__259.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3472 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231227_003000__604 | 4 | 0.0 | 40.4034 | 3 | [100, 305] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231227_003000__604.json | 88.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3473 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_073143__209 | 0 | 0.0 | 3.30541 | 0 | [118, 79] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231214_073143__209.json | 50.0 | missing | missing | missing | |
| 3474 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_143912__770 | 2 | 0.0 | 19.2435 | 3 | [139, 132] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_143912__770.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3475 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_143940__481 | 2 | 0.0 | 27.7433 | 3 | [139, 201] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_143940__481.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3476 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_002920__103 | 5 | 0.0 | 44.7649 | 3 | [139, 335] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231227_002920__103.json | 93.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3477 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_073140__178 | 0 | 0.0 | 11.3937 | 0 | [211, 293] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_073140__178.json | 0.0 | missing | missing | missing | |
| 3478 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_143812__987 | 5 | 0.0 | 47.8522 | 3 | [232, 168] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_143812__987.json | 93.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3479 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_143853__615 | 0 | 0.0 | 40.7208 | 0 | [232, 287] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_143853__615.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3480 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_002835__257 | 5 | 0.0 | 54.502 | 3 | [232, 229] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_002835__257.json | 93.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3481 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_073233__398 | 0 | 0.0 | 2.93157 | 0 | [11, 78] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_073233__398.json | 0.0 | missing | missing | missing | |
| 3482 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_144412__445 | 2 | 0.0 | 41.893 | 3 | [403, 266] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_144412__445.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3483 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_144439__578 | 2 | 0.0 | 25.9237 | 3 | [403, 140] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_144439__578.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3484 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_003148__688 | 4 | 0.0 | 42.9906 | 3 | [403, 272] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_003148__688.json | 88.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3485 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_073230__605 | 0 | 0.0 | 21.2264 | 0 | [389, 483] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231214_073230__605.json | 50.0 | missing | missing | missing | |
| 3486 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_144248__506 | 4 | 0.0 | 43.6109 | 3 | [400, 278] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_144248__506.json | 88.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3487 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_144330__537 | 2 | 0.0 | 42.1953 | 2 | [400, 267] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_144330__537.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 3488 | Apple-MacBook-Pro-M1 | extract_julia_code | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_003105__207 | 5 | 0.0 | 63.6383 | 3 | [400, 430] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231227_003105__207.json | 93.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3489 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_235031__346 | 0 | 0.0 | 9.65595 | 0 | [1, 303] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_235031__346.json | 25.0 | missing | missing | missing | |
| 3490 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231219_235039__258 | 0 | 0.0 | 7.68142 | 0 | [1, 244] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231219_235039__258.json | 25.0 | missing | missing | missing | |
| 3491 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_145513__473 | 0 | 0.0 | 14.203 | 0 | [96, 238] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_145513__473.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3492 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_145530__520 | 0 | 0.0 | 15.9989 | 0 | [96, 269] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_145530__520.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3493 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_003826__820 | 0 | 0.0 | 17.485 | 0 | [96, 294] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231227_003826__820.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3494 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_235003__771 | 0 | 0.0 | 2.11132 | 0 | [1, 68] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_235003__771.json | 25.0 | missing | missing | missing | |
| 3495 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231219_235010__993 | 0 | 0.0 | 7.55718 | 0 | [1, 237] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231219_235010__993.json | 25.0 | missing | missing | missing | |
| 3496 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_145439__658 | 2 | 0.0 | 12.431 | 3 | [137, 198] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_145439__658.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3497 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_145459__964 | 0 | 0.0 | 19.9462 | 0 | [137, 328] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_145459__964.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3498 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_003808__452 | 2 | 0.0 | 14.662 | 1 | [137, 236] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_003808__452.json | 66.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 3499 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231219_234940__384 | 0 | 0.0 | 15.2812 | 0 | [1, 447] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_234940__384.json | 0.0 | missing | missing | missing | |
| 3500 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231219_234955__102 | 0 | 0.0 | 15.4767 | 0 | [1, 453] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231219_234955__102.json | 25.0 | missing | missing | missing | |
| 3501 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_145411__592 | 0 | 0.0 | 21.7987 | 0 | [231, 189] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_145411__592.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3502 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_145427__822 | 0 | 0.0 | 15.9172 | 0 | [231, 243] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_145427__822.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3503 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_003754__852 | 0 | 0.0 | 22.9093 | 0 | [231, 214] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_003754__852.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3504 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231219_235249__931 | 0 | 0.0 | 19.1812 | 0 | [1, 525] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_235249__931.json | 25.0 | missing | missing | missing | |
| 3505 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231219_235303__308 | 0 | 0.0 | 14.0671 | 0 | [1, 393] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231219_235303__308.json | 0.0 | missing | missing | missing | |
| 3506 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_145702__540 | 0 | 0.0 | 17.0227 | 0 | [404, 236] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_145702__540.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3507 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_145720__158 | 0 | 0.0 | 18.0215 | 0 | [404, 253] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_145720__158.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3508 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_003904__261 | 0 | 0.0 | 15.7365 | 0 | [404, 214] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_003904__261.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3509 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231219_235158__209 | 0 | 0.0 | 14.6919 | 0 | [1, 410] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_235158__209.json | 25.0 | missing | missing | missing | |
| 3510 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231219_235211__192 | 0 | 0.0 | 13.6435 | 0 | [1, 382] | 0.5.0-DEV | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231219_235211__192.json | 0.0 | missing | missing | missing | |
| 3511 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_145631__986 | 0 | 0.0 | 20.1574 | 0 | [402, 289] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_145631__986.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3512 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_145645__896 | 0 | 0.0 | 14.3253 | 0 | [402, 191] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_145645__896.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3513 | Apple-MacBook-Pro-M1 | extract_julia_code | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_003848__498 | 2 | 0.0 | 21.8503 | 4 | [402, 316] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_003848__498.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3514 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | InJulia | 1SHOT | true | false | 5 | 20231214_072838__341 | 0 | 0.0 | 12.0489 | 0 | [89, 356] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__InJulia__1SHOT__20231214_072838__341.json | 25.0 | missing | missing | missing | |
| 3515 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231225_143156__216 | 0 | 0.0 | 4.26577 | 0 | [96, 242] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_143156__216.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3516 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231225_143203__133 | 0 | 0.0 | 7.39367 | 0 | [96, 418] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_143203__133.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3517 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231227_002553__692 | 0 | 0.0 | 8.27899 | 0 | [96, 462] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__InJulia__1SHOT__20231227_002553__692.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3518 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_072826__702 | 0 | 0.0 | 11.215 | 0 | [118, 320] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231214_072826__702.json | 0.0 | missing | missing | missing | |
| 3519 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_143146__337 | 0 | 0.0 | 7.47473 | 0 | [133, 411] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_143146__337.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3520 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_143151__766 | 0 | 0.0 | 5.36306 | 0 | [133, 292] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_143151__766.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3521 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_002544__558 | 0 | 0.0 | 6.10256 | 0 | [133, 332] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231227_002544__558.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3522 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_072815__284 | 0 | 0.0 | 10.9111 | 0 | [211, 279] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231214_072815__284.json | 0.0 | missing | missing | missing | |
| 3523 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_143131__468 | 0 | 0.0 | 13.4335 | 0 | [221, 551] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_143131__468.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3524 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_143138__781 | 0 | 0.0 | 6.93661 | 0 | [221, 360] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_143138__781.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3525 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_002538__509 | 0 | 0.0 | 8.48354 | 0 | [221, 299] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231227_002538__509.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3526 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_072939__333 | 0 | 0.0 | 15.8275 | 0 | [11, 432] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231214_072939__333.json | 50.0 | missing | missing | missing | |
| 3527 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_143259__101 | 0 | 0.0 | 2.47409 | 0 | [383, 78] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_143259__101.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3528 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_143305__984 | 0 | 0.0 | 5.9577 | 0 | [383, 265] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_143305__984.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3529 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_002613__269 | 0 | 0.0 | 11.5303 | 0 | [383, 540] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231227_002613__269.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3530 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_072923__936 | 0 | 0.0 | 30.3348 | 0 | [389, 709] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231214_072923__936.json | 0.0 | missing | missing | missing | |
| 3531 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_143233__162 | 0 | 0.0 | 13.4047 | 0 | [381, 633] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_143233__162.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3532 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_143256__505 | 0 | 0.0 | 23.0824 | 0 | [381, 1064] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_143256__505.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3533 | Apple-MacBook-Pro-M1 | extract_julia_code | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_002602__160 | 0 | 0.0 | 8.99096 | 0 | [381, 417] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231227_002602__160.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3534 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | InJulia | 1SHOT | true | false | 5 | 20231214_010108__132 | 0 | 0.0 | 11.5249 | 0 | [89, 340] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__InJulia__1SHOT__20231214_010108__132.json | 25.0 | missing | missing | missing | |
| 3535 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_141151__703 | 2 | 0.0 | 16.0643 | 3 | [96, 519] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_141151__703.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3536 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_141204__298 | 2 | 0.0 | 12.7557 | 3 | [96, 411] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_141204__298.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3537 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231227_001917__857 | 2 | 0.0 | 11.0773 | 3 | [96, 354] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__InJulia__1SHOT__20231227_001917__857.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3538 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_010056__237 | 0 | 0.0 | 10.9506 | 0 | [118, 312] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231214_010056__237.json | 50.0 | missing | missing | missing | |
| 3539 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_141127__714 | 0 | 0.0 | 8.59477 | 0 | [137, 262] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_141127__714.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3540 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_141135__460 | 2 | 0.0 | 7.99233 | 3 | [137, 244] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_141135__460.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3541 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_001906__246 | 2 | 0.0 | 5.88616 | 3 | [137, 174] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231227_001906__246.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3542 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_010045__955 | 0 | 0.0 | 13.9154 | 0 | [211, 362] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231214_010045__955.json | 0.0 | missing | missing | missing | |
| 3543 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_141107__385 | 0 | 0.0 | 23.548 | 0 | [231, 554] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_141107__385.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3544 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_141118__416 | 2 | 0.0 | 10.423 | 1 | [231, 306] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_141118__416.json | 66.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 3545 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_001900__328 | 0 | 0.0 | 20.7668 | 0 | [231, 470] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231227_001900__328.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3546 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_010157__736 | 0 | 0.0 | 13.955 | 0 | [11, 383] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231214_010157__736.json | 25.0 | missing | missing | missing | |
| 3547 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_141313__683 | 0 | 0.0 | 9.01912 | 0 | [404, 232] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_141313__683.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3548 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_141331__952 | 0 | 0.0 | 18.2488 | 0 | [404, 519] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_141331__952.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3549 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_001935__140 | 2 | 0.0 | 8.55234 | 3 | [404, 216] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231227_001935__140.json | 78.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3550 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_010143__230 | 0 | 0.0 | 19.1573 | 0 | [389, 432] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231214_010143__230.json | 50.0 | missing | missing | missing | |
| 3551 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_141245__683 | 2 | 0.0 | 14.6131 | 4 | [402, 409] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_141245__683.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3552 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_141304__331 | 0 | 0.0 | 18.7693 | 0 | [402, 536] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_141304__331.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3553 | Apple-MacBook-Pro-M1 | extract_julia_code | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_001927__738 | 0 | 0.0 | 9.8246 | 3 | [402, 256] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231227_001927__738.json | 68.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3554 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231214_010238__247 | 0 | 0.0 | 14.2107 | 0 | [89, 419] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__InJulia__1SHOT__20231214_010238__247.json | 50.0 | missing | missing | missing | |
| 3555 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231225_141657__153 | 0 | 0.0 | 59.1145 | 4 | [88, 447] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_141657__153.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3556 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231225_141833__603 | 5 | 0.0 | 96.2979 | 4 | [88, 724] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_141833__603.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3557 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231227_002143__695 | 0 | 0.0 | 41.9031 | 3 | [88, 313] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__InJulia__1SHOT__20231227_002143__695.json | 68.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3558 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_010224__749 | 0 | 0.0 | 9.71578 | 0 | [118, 276] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231214_010224__749.json | 50.0 | missing | missing | missing | |
| 3559 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_141504__479 | 2 | 0.0 | 30.4412 | 4 | [127, 220] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_141504__479.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3560 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_141557__145 | 0 | 0.0 | 53.7142 | 4 | [127, 399] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_141557__145.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3561 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_002101__135 | 2 | 0.0 | 29.0221 | 4 | [127, 208] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231227_002101__135.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3562 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_010214__362 | 0 | 0.0 | 17.3004 | 0 | [211, 458] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231214_010214__362.json | 50.0 | missing | missing | missing | |
| 3563 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_141401__480 | 0 | 0.0 | 29.2389 | 0 | [220, 8] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_141401__480.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3564 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_141433__238 | 5 | 0.0 | 32.4772 | 3 | [220, 218] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_141433__238.json | 93.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3565 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_002032__816 | 0 | 0.0 | 55.9983 | 0 | [220, 226] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231227_002032__816.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3566 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_010612__558 | 0 | 0.0 | 120.555 | 0 | [11, 863] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231214_010612__558.json | 0.0 | missing | missing | missing | |
| 3567 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_142253__532 | 0 | 0.0 | 56.5415 | 0 | [401, 363] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_142253__532.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3568 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_142347__639 | 0 | 0.0 | 54.5099 | 4 | [401, 348] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_142347__639.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3569 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_002307__790 | 0 | 0.0 | 28.5303 | 4 | [401, 153] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231227_002307__790.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3570 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_010411__873 | 2 | 0.0 | 65.0708 | 2 | [389, 444] | 0.4.0 | 4 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231214_010411__873.json | 72.5 | missing | missing | missing | |
| 3571 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_142102__321 | 2 | 0.0 | 47.8097 | 4 | [399, 298] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_142102__321.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3572 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_142156__895 | 1 | 0.0 | 53.9988 | 3 | [399, 344] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_142156__895.json | 73.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3573 | Apple-MacBook-Pro-M1 | extract_julia_code | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_002239__585 | 0 | 0.0 | 55.7434 | 0 | [399, 356] | 0.6.0 | 4 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/extract_julia_code/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231227_002239__585.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3574 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | InJulia | 1SHOT | false | false | 5 | 20231214_074026__805 | 0 | 0.0 | 15.1538 | 0 | [113, 435] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231214_074026__805.json | 0.0 | missing | missing | missing | |
| 3575 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | InJulia | 1SHOT | true | false | 5 | 20231225_152232__738 | 0 | 0.0 | 18.8136 | 0 | [121, 338] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_152232__738.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3576 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_152251__709 | 5 | 0.0 | 19.5704 | 4 | [121, 351] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_152251__709.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3577 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231227_005253__892 | 5 | 0.0 | 20.9975 | 4 | [121, 377] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231227_005253__892.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3578 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_074010__618 | 0 | 0.0 | 13.884 | 0 | [142, 388] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231214_074010__618.json | 25.0 | missing | missing | missing | |
| 3579 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_152210__505 | 0 | 0.0 | 5.04298 | 0 | [159, 72] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_152210__505.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3580 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_152213__919 | 0 | 0.0 | 2.85983 | 0 | [159, 30] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_152213__919.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3581 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_005232__402 | 4 | 0.0 | 15.4056 | 3 | [159, 267] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231227_005232__402.json | 88.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3582 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_073957__302 | 0 | 0.0 | 18.4513 | 0 | [217, 486] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231214_073957__302.json | 25.0 | missing | missing | missing | |
| 3583 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_152150__958 | 5 | 0.0 | 27.7109 | 4 | [235, 299] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_152150__958.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3584 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_152205__274 | 5 | 0.0 | 15.1112 | 4 | [235, 246] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_152205__274.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3585 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_005216__628 | 5 | 0.0 | 24.5138 | 4 | [235, 246] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231227_005216__628.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3586 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_074131__418 | 0 | 0.0 | 27.613 | 0 | [11, 720] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231214_074131__418.json | 25.0 | missing | missing | missing | |
| 3587 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_152412__935 | 5 | 0.0 | 23.975 | 4 | [424, 369] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_152412__935.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3588 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_152423__708 | 0 | 0.0 | 10.8193 | 0 | [424, 134] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_152423__708.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3589 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_005322__400 | 5 | 0.0 | 13.1647 | 4 | [424, 176] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231227_005322__400.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3590 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_074104__202 | 0 | 0.0 | 22.0255 | 0 | [413, 494] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231214_074104__202.json | 25.0 | missing | missing | missing | |
| 3591 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_152334__241 | 0 | 0.0 | 15.7836 | 0 | [421, 224] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_152334__241.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3592 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_152348__219 | 3 | 0.0 | 14.0877 | 4 | [421, 193] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_152348__219.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3593 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_005308__284 | 4 | 0.0 | 15.5077 | 3 | [421, 218] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231227_005308__284.json | 88.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3594 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | InJulia | 1SHOT | true | false | 5 | 20231214_074216__436 | 0 | 0.0 | 17.2876 | 0 | [113, 495] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__InJulia__1SHOT__20231214_074216__436.json | 25.0 | missing | missing | missing | |
| 3595 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_152504__162 | 0 | 0.0 | 11.4887 | 0 | [95, 206] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_152504__162.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3596 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_152509__895 | 0 | 0.0 | 4.53152 | 0 | [95, 72] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_152509__895.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3597 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_074159__223 | 0 | 0.0 | 10.1801 | 0 | [142, 281] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231214_074159__223.json | 25.0 | missing | missing | missing | |
| 3598 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_152443__980 | 0 | 0.0 | 4.03724 | 0 | [96, 63] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_152443__980.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3599 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_152452__435 | 0 | 0.0 | 9.60543 | 0 | [96, 170] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_152452__435.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3600 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_074148__871 | 0 | 0.0 | 17.0336 | 0 | [217, 448] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231214_074148__871.json | 25.0 | missing | missing | missing | |
| 3601 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_152436__613 | 0 | 0.0 | 12.9216 | 0 | [110, 37] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_152436__613.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3602 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_152439__370 | 0 | 0.0 | 2.53689 | 0 | [110, 29] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_152439__370.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3603 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_074338__461 | 0 | 0.0 | 17.5613 | 0 | [11, 474] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231214_074338__461.json | 25.0 | missing | missing | missing | |
| 3604 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_152541__380 | 0 | 0.0 | 1.55749 | 0 | [113, 10] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_152541__380.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3605 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_152547__590 | 0 | 0.0 | 6.15518 | 0 | [113, 99] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_152547__590.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3606 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_074321__901 | 0 | 0.0 | 51.1592 | 0 | [413, 1166] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231214_074321__901.json | 25.0 | missing | missing | missing | |
| 3607 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_152533__493 | 0 | 0.0 | 6.04138 | 0 | [110, 97] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_152533__493.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3608 | Apple-MacBook-Pro-M1 | ispersonal | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_152539__660 | 0 | 0.0 | 6.127 | 0 | [110, 99] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_152539__660.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3609 | Apple-MacBook-Pro-M1 | ispersonal | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_154526__414 | 0 | 0.0 | 49.6007 | 0 | [111, 294] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_154526__414.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3610 | Apple-MacBook-Pro-M1 | ispersonal | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_154621__873 | 0 | 0.0 | 54.3269 | 0 | [111, 323] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_154621__873.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3611 | Apple-MacBook-Pro-M1 | ispersonal | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_010516__563 | 0 | 0.0 | 59.3721 | 0 | [111, 354] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_010516__563.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3612 | Apple-MacBook-Pro-M1 | ispersonal | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_154337__895 | 0 | 0.0 | 49.6022 | 0 | [152, 288] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_154337__895.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3613 | Apple-MacBook-Pro-M1 | ispersonal | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_154437__150 | 0 | 0.0 | 59.5843 | 0 | [152, 350] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_154437__150.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3614 | Apple-MacBook-Pro-M1 | ispersonal | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_010416__137 | 0 | 0.0 | 85.6099 | 0 | [152, 507] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_010416__137.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3615 | Apple-MacBook-Pro-M1 | ispersonal | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_154156__827 | 0 | 0.0 | 87.4402 | 0 | [226, 347] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_154156__827.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3616 | Apple-MacBook-Pro-M1 | ispersonal | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_154247__146 | 5 | 0.0 | 50.6823 | 4 | [226, 278] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_154247__146.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3617 | Apple-MacBook-Pro-M1 | ispersonal | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_010251__216 | 5 | 0.0 | 85.4935 | 4 | [226, 347] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_010251__216.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3618 | Apple-MacBook-Pro-M1 | ispersonal | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_155155__608 | 5 | 0.0 | 71.8722 | 4 | [440, 366] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_155155__608.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3619 | Apple-MacBook-Pro-M1 | ispersonal | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_155254__306 | 0 | 0.0 | 59.3767 | 0 | [440, 296] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_155254__306.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3620 | Apple-MacBook-Pro-M1 | ispersonal | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_010749__323 | 0 | 0.0 | 84.2359 | 0 | [440, 443] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_010749__323.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3621 | Apple-MacBook-Pro-M1 | ispersonal | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_154931__397 | 5 | 0.0 | 74.5578 | 4 | [438, 387] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_154931__397.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3622 | Apple-MacBook-Pro-M1 | ispersonal | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_155042__128 | 0 | 0.0 | 71.3681 | 0 | [438, 368] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_155042__128.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3623 | Apple-MacBook-Pro-M1 | ispersonal | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_010625__652 | 0 | 0.0 | 68.8174 | 0 | [438, 352] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_010625__652.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3624 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_011444__132 | 0 | 0.0 | 7.45332 | 0 | [106, 282] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_011444__132.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3625 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_120723__832 | 0 | 0.0 | 12.0675 | 0 | [106, 456] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_120723__832.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3626 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_120730__308 | 0 | 0.0 | 7.3466 | 0 | [106, 278] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_120730__308.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3627 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_120742__112 | 0 | 0.0 | 11.8559 | 0 | [106, 446] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_120742__112.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3628 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_011436__614 | 0 | 0.0 | 6.0428 | 0 | [143, 222] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_011436__614.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3629 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_120658__305 | 0 | 0.0 | 7.39594 | 0 | [143, 274] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_120658__305.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3630 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_120703__722 | 0 | 0.0 | 5.41152 | 0 | [143, 196] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_120703__722.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3631 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_120711__429 | 0 | 0.0 | 7.72115 | 0 | [143, 286] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_120711__429.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3632 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_011430__358 | 0 | 0.0 | 12.5438 | 0 | [215, 330] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_011430__358.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3633 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_120635__931 | 0 | 0.0 | 10.9537 | 0 | [215, 257] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_120635__931.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3634 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_120640__213 | 0 | 0.0 | 4.92063 | 0 | [215, 167] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_120640__213.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3635 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_120650__702 | 0 | 0.0 | 10.4192 | 0 | [215, 373] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_120650__702.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3636 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_011502__462 | 0 | 0.0 | 6.78554 | 0 | [395, 205] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_011502__462.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3637 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_120823__839 | 0 | 0.0 | 8.533 | 0 | [395, 268] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_120823__839.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3638 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_120830__545 | 0 | 0.0 | 7.02728 | 0 | [395, 210] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_120830__545.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3639 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_120844__520 | 0 | 0.0 | 14.7818 | 0 | [395, 486] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_120844__520.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3640 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_011456__993 | 0 | 0.0 | 11.5829 | 0 | [392, 378] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_011456__993.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3641 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_120749__482 | 0 | 0.0 | 6.16703 | 0 | [392, 180] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_120749__482.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3642 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_120801__525 | 0 | 0.0 | 12.6417 | 0 | [392, 411] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_120801__525.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3643 | Apple-MacBook-Pro-M1 | ispersonal | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_120814__532 | 0 | 0.0 | 12.8522 | 0 | [392, 419] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_120814__532.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3644 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | InJulia | 1SHOT | true | false | 5 | 20231214_073313__114 | 0 | 0.0 | 11.4733 | 0 | [113, 330] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__InJulia__1SHOT__20231214_073313__114.json | 25.0 | missing | missing | missing | |
| 3645 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | InJulia | 1SHOT | false | false | 5 | 20231225_150549__320 | 0 | 0.0 | 16.4283 | 0 | [113, 473] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__InJulia__1SHOT__20231225_150549__320.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3646 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | InJulia | 1SHOT | true | false | 5 | 20231225_150606__884 | 0 | 0.0 | 16.4942 | 0 | [1, 498] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__InJulia__1SHOT__20231225_150606__884.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3647 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | InJulia | 1SHOT | false | false | 5 | 20231227_004408__610 | 0 | 0.0 | 14.19 | 0 | [113, 415] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__InJulia__1SHOT__20231227_004408__610.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3648 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_073301__291 | 0 | 0.0 | 10.3625 | 0 | [142, 286] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaExpertAsk__1SHOT__20231214_073301__291.json | 25.0 | missing | missing | missing | |
| 3649 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_150519__541 | 0 | 0.0 | 13.0245 | 0 | [142, 365] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_150519__541.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3650 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_150533__433 | 0 | 0.0 | 14.0356 | 0 | [1, 423] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_150533__433.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3651 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_004354__842 | 0 | 0.0 | 9.57441 | 0 | [142, 268] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaExpertAsk__1SHOT__20231227_004354__842.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3652 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_073251__832 | 0 | 0.0 | 17.3102 | 0 | [217, 455] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_073251__832.json | 25.0 | missing | missing | missing | |
| 3653 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_150457__537 | 0 | 0.0 | 12.4388 | 0 | [235, 177] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_150457__537.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3654 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_150506__393 | 0 | 0.0 | 8.8189 | 0 | [1, 266] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_150506__393.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3655 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_004344__597 | 0 | 0.0 | 13.3088 | 0 | [235, 207] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_004344__597.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3656 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_073431__607 | 0 | 0.0 | 26.7839 | 0 | [11, 699] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_073431__607.json | 0.0 | missing | missing | missing | |
| 3657 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_150743__320 | 0 | 0.0 | 25.922 | 0 | [11, 684] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_150743__320.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3658 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_150803__462 | 0 | 0.0 | 19.9087 | 0 | [1, 541] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_150803__462.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3659 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_004500__231 | 0 | 0.0 | 19.1455 | 0 | [11, 521] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_004500__231.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3660 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_073404__406 | 0 | 0.0 | 32.971 | 0 | [413, 761] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaRecapTask__1SHOT__20231214_073404__406.json | 25.0 | missing | missing | missing | |
| 3661 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_150654__703 | 0 | 0.0 | 26.3314 | 0 | [413, 605] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_150654__703.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3662 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_150717__523 | 0 | 0.0 | 23.1621 | 0 | [1, 622] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_150717__523.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3663 | Apple-MacBook-Pro-M1 | ispersonal | llama2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_004440__219 | 0 | 0.0 | 32.5411 | 0 | [413, 761] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/llama2/evaluation__JuliaRecapTask__1SHOT__20231227_004440__219.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3664 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | InJulia | 1SHOT | true | false | 5 | 20231214_074416__317 | 0 | 0.0 | 12.3713 | 0 | [113, 356] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__InJulia__1SHOT__20231214_074416__317.json | 25.0 | missing | missing | missing | |
| 3665 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_152636__424 | 4 | 0.0 | 10.3217 | 3 | [113, 334] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__InJulia__1SHOT__20231225_152636__424.json | 88.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3666 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_152645__271 | 0 | 0.0 | 8.4865 | 0 | [113, 271] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__InJulia__1SHOT__20231225_152645__271.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3667 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | InJulia | 1SHOT | true | true | 5 | 20231227_005359__990 | 0 | 0.0 | 10.6798 | 0 | [113, 343] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__InJulia__1SHOT__20231227_005359__990.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3668 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_074404__875 | 0 | 0.0 | 9.81742 | 0 | [142, 270] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231214_074404__875.json | 25.0 | missing | missing | missing | |
| 3669 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_152615__880 | 5 | 0.0 | 7.88556 | 4 | [152, 246] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_152615__880.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3670 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_152626__763 | 4 | 0.0 | 10.1604 | 3 | [152, 321] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_152626__763.json | 88.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3671 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_005348__532 | 5 | 0.0 | 7.71003 | 4 | [152, 239] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231227_005348__532.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3672 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_074354__514 | 0 | 0.0 | 15.7895 | 0 | [217, 414] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231214_074354__514.json | 25.0 | missing | missing | missing | |
| 3673 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_152600__946 | 5 | 0.0 | 13.0923 | 4 | [227, 204] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_152600__946.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3674 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_152607__712 | 5 | 0.0 | 6.90933 | 4 | [227, 197] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_152607__712.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3675 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_005340__842 | 5 | 0.0 | 18.9434 | 4 | [227, 400] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231227_005340__842.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3676 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_074540__517 | 0 | 0.0 | 31.8717 | 0 | [11, 820] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231214_074540__517.json | 0.0 | missing | missing | missing | |
| 3677 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_152743__743 | 0 | 0.0 | 10.723 | 0 | [416, 292] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_152743__743.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3678 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_152755__612 | 0 | 0.0 | 12.0131 | 0 | [416, 333] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_152755__612.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3679 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_005420__902 | 0 | 0.0 | 10.4265 | 0 | [416, 280] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231227_005420__902.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3680 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_074508__933 | 0 | 0.0 | 31.4976 | 0 | [413, 726] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaRecapTask__1SHOT__20231214_074508__933.json | 0.0 | missing | missing | missing | |
| 3681 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_152723__502 | 0 | 0.0 | 13.2438 | 0 | [413, 371] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_152723__502.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3682 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_152732__734 | 0 | 0.0 | 9.02177 | 0 | [413, 238] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_152732__734.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3683 | Apple-MacBook-Pro-M1 | ispersonal | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_005410__240 | 3 | 0.0 | 10.4878 | 4 | [413, 282] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder/evaluation__JuliaRecapTask__1SHOT__20231227_005410__240.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3684 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_181408__465 | 0 | 0.0 | 15.5665 | 0 | [113, 297] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_181408__465.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3685 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_181421__824 | 0 | 0.0 | 13.3397 | 0 | [113, 253] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_181421__824.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3686 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_181439__659 | 5 | 0.0 | 18.1706 | 4 | [113, 348] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_181439__659.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3687 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_181312__139 | 4 | 0.0 | 17.5705 | 3 | [152, 332] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_181312__139.json | 88.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3688 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_181330__800 | 4 | 0.0 | 18.4887 | 3 | [152, 350] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_181330__800.json | 88.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 3689 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_181352__784 | 0 | 0.0 | 22.1591 | 0 | [152, 421] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_181352__784.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3690 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_181226__451 | 0 | 0.0 | 18.5524 | 0 | [227, 339] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_181226__451.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3691 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_181243__621 | 0 | 0.0 | 17.0947 | 0 | [227, 311] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_181243__621.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3692 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_181254__106 | 1 | 0.0 | 11.2522 | 1 | [227, 197] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_181254__106.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 3693 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_181544__599 | 0 | 0.0 | 15.3087 | 0 | [416, 255] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_181544__599.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3694 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_181608__727 | 0 | 0.0 | 23.2654 | 0 | [416, 398] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_181608__727.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3695 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_181629__525 | 0 | 0.0 | 21.0057 | 0 | [416, 341] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_181629__525.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3696 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_181457__246 | 5 | 0.0 | 17.7422 | 4 | [413, 301] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_181457__246.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3697 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_181514__427 | 0 | 0.0 | 16.1417 | 0 | [413, 270] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_181514__427.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3698 | Apple-MacBook-Pro-M1 | ispersonal | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_181529__469 | 5 | 0.0 | 14.7465 | 4 | [413, 244] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_181529__469.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3699 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231225_155741__939 | 0 | 0.0 | 5.10294 | 0 | [104, 117] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_155741__939.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3700 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231225_155746__950 | 0 | 0.0 | 5.67622 | 0 | [104, 132] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_155746__950.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3701 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_010952__724 | 0 | 0.0 | 5.04747 | 0 | [104, 115] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_010952__724.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3702 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_155729__364 | 0 | 0.0 | 8.61374 | 0 | [145, 204] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_155729__364.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3703 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_155736__503 | 0 | 0.0 | 6.33145 | 0 | [145, 144] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_155736__503.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3704 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_010947__663 | 0 | 0.0 | 6.10335 | 0 | [145, 138] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_010947__663.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3705 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_155716__324 | 0 | 0.0 | 24.252 | 0 | [219, 449] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_155716__324.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3706 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_155721__340 | 0 | 0.0 | 4.00173 | 0 | [219, 74] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_155721__340.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3707 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_010941__277 | 0 | 0.0 | 8.2067 | 0 | [219, 44] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_010941__277.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3708 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_155855__227 | 0 | 0.0 | 18.7205 | 0 | [412, 415] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_155855__227.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3709 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_155914__851 | 0 | 0.0 | 19.3838 | 0 | [412, 431] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_155914__851.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3710 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_011027__258 | 0 | 0.0 | 11.2751 | 0 | [412, 230] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_011027__258.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3711 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_155816__699 | 0 | 0.0 | 14.1113 | 0 | [410, 302] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_155816__699.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3712 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_155836__365 | 0 | 0.0 | 19.755 | 0 | [410, 441] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_155836__365.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3713 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_011015__403 | 0 | 0.0 | 23.6343 | 0 | [410, 532] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_011015__403.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3714 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | false | 5 | 20231227_235109__901 | 0 | 0.0 | 14.4449 | 0 | [103, 452] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_235109__901.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3715 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | false | 5 | 20231227_235120__768 | 0 | 0.0 | 10.3994 | 0 | [103, 323] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_235120__768.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3716 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | false | 5 | 20231227_235130__616 | 0 | 0.0 | 10.3611 | 0 | [103, 322] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_235130__616.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3717 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | false | 5 | 20231227_235148__534 | 0 | 0.0 | 18.2084 | 0 | [103, 570] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_235148__534.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3718 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | false | 5 | 20231227_235158__120 | 0 | 0.0 | 9.51657 | 0 | [103, 294] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231227_235158__120.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3719 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_235028__155 | 0 | 0.0 | 6.22616 | 0 | [144, 182] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_235028__155.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3720 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_235034__732 | 0 | 0.0 | 5.30238 | 0 | [144, 151] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_235034__732.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3721 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_235039__494 | 0 | 0.0 | 5.4409 | 0 | [144, 156] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_235039__494.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3722 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_235046__692 | 0 | 0.0 | 6.78885 | 0 | [144, 200] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_235046__692.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3723 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_235055__575 | 0 | 0.0 | 9.11697 | 0 | [144, 274] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231227_235055__575.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3724 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_234947__504 | 0 | 0.0 | 10.092 | 0 | [218, 268] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_234947__504.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3725 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_234952__395 | 5 | 0.0 | 5.03961 | 4 | [218, 132] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_234952__395.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3726 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_235005__809 | 0 | 0.0 | 12.6276 | 0 | [218, 374] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_235005__809.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3727 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_235012__727 | 0 | 0.0 | 7.20025 | 0 | [218, 202] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_235012__727.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3728 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_235022__437 | 0 | 0.0 | 10.257 | 0 | [218, 300] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231227_235022__437.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3729 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_235332__293 | 0 | 0.0 | 16.1225 | 0 | [411, 445] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_235332__293.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3730 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_235343__570 | 0 | 0.0 | 10.8657 | 0 | [411, 283] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_235343__570.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3731 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_235358__234 | 0 | 0.0 | 14.9814 | 0 | [411, 410] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_235358__234.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3732 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_235414__498 | 0 | 0.0 | 16.095 | 0 | [411, 444] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_235414__498.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3733 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_235429__777 | 0 | 0.0 | 14.4418 | 0 | [411, 393] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231227_235429__777.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3734 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_235214__212 | 0 | 0.0 | 16.3224 | 0 | [409, 451] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_235214__212.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3735 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_235231__375 | 0 | 0.0 | 16.2442 | 0 | [409, 448] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_235231__375.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3736 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_235245__935 | 0 | 0.0 | 14.5914 | 0 | [409, 398] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_235245__935.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3737 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_235302__169 | 0 | 0.0 | 16.5361 | 0 | [409, 457] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_235302__169.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3738 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_235316__425 | 0 | 0.0 | 14.2829 | 0 | [409, 389] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231227_235316__425.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3739 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_235613__964 | 5 | 0.0 | 11.9654 | 4 | [103, 293] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_235613__964.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3740 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_235626__659 | 0 | 0.0 | 12.8726 | 0 | [103, 316] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_235626__659.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3741 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_235640__684 | 0 | 0.0 | 14.2075 | 0 | [103, 350] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_235640__684.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3742 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_235650__359 | 0 | 0.0 | 10.1751 | 0 | [103, 247] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_235650__359.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3743 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_235707__401 | 0 | 0.0 | 16.3844 | 0 | [103, 405] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231227_235707__401.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3744 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_235530__772 | 0 | 0.0 | 10.7264 | 0 | [144, 256] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_235530__772.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3745 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_235536__700 | 0 | 0.0 | 5.95002 | 0 | [144, 133] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_235536__700.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3746 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_235543__476 | 0 | 0.0 | 7.17292 | 0 | [144, 165] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_235543__476.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3747 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_235549__275 | 3 | 0.0 | 5.94463 | 4 | [144, 133] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_235549__275.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3748 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_235601__277 | 0 | 0.0 | 11.2921 | 0 | [144, 270] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_235601__277.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3749 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_235447__645 | 0 | 0.0 | 18.8293 | 0 | [218, 428] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_235447__645.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3750 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_235456__364 | 0 | 0.0 | 8.82706 | 0 | [218, 197] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_235456__364.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3751 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_235502__806 | 5 | 0.0 | 6.06019 | 4 | [218, 126] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_235502__806.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3752 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_235510__850 | 0 | 0.0 | 7.60673 | 0 | [218, 166] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_235510__850.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3753 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_235520__755 | 0 | 0.0 | 9.61687 | 0 | [218, 217] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_235520__755.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3754 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_235854__790 | 0 | 0.0 | 17.9169 | 0 | [411, 392] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_235854__790.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3755 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_235918__949 | 0 | 0.0 | 23.8077 | 0 | [411, 534] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_235918__949.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3756 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_235932__134 | 0 | 0.0 | 14.0347 | 0 | [411, 297] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_235932__134.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3757 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_235947__611 | 0 | 0.0 | 15.3499 | 0 | [411, 329] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_235947__611.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3758 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231228_000005__779 | 0 | 0.0 | 18.1208 | 0 | [411, 397] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_000005__779.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3759 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_235723__548 | 0 | 0.0 | 16.2363 | 0 | [409, 351] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_235723__548.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3760 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_235741__939 | 0 | 0.0 | 17.7143 | 0 | [409, 387] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_235741__939.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3761 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_235800__614 | 0 | 0.0 | 18.9499 | 0 | [409, 417] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_235800__614.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3762 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_235819__627 | 0 | 0.0 | 19.0657 | 0 | [409, 420] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_235819__627.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3763 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_235836__439 | 0 | 0.0 | 17.3335 | 0 | [409, 378] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_235836__439.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3764 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_122418__471 | 5 | 0.0 | 20.8899 | 4 | [103, 381] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_122418__471.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3765 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | false | 5 | 20231226_122435__267 | 0 | 0.0 | 16.2782 | 0 | [103, 295] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_122435__267.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3766 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_011317__610 | 5 | 0.0 | 15.0399 | 4 | [103, 271] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231227_011317__610.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3767 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231226_122346__135 | 0 | 0.0 | 8.2443 | 0 | [144, 140] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_122346__135.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3768 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231226_122357__638 | 0 | 0.0 | 11.1003 | 0 | [144, 194] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_122357__638.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3769 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_011301__671 | 0 | 0.0 | 8.6966 | 0 | [144, 148] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_011301__671.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3770 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_122327__244 | 0 | 0.0 | 8.59542 | 0 | [218, 139] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_122327__244.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3771 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_122338__315 | 3 | 0.0 | 10.4612 | 4 | [218, 174] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_122338__315.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3772 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_011253__562 | 0 | 0.0 | 17.3439 | 0 | [218, 135] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_011253__562.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3773 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231226_122624__395 | 0 | 0.0 | 20.5047 | 0 | [411, 336] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_122624__395.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3774 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_122649__841 | 0 | 0.0 | 25.4783 | 0 | [411, 426] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_122649__841.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3775 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_011418__726 | 0 | 0.0 | 33.0579 | 0 | [411, 560] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_011418__726.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3776 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231226_122537__255 | 0 | 0.0 | 29.8301 | 0 | [409, 504] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_122537__255.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3777 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231226_122603__755 | 0 | 0.0 | 25.631 | 0 | [409, 429] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_122603__755.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3778 | Apple-MacBook-Pro-M1 | ispersonal | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_011345__648 | 0 | 0.0 | 28.0065 | 0 | [409, 470] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_011345__648.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3779 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_121400__767 | 0 | 0.0 | 40.7177 | 0 | [107, 219] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_121400__767.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3780 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_121437__422 | 5 | 0.0 | 36.5609 | 4 | [107, 200] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_121437__422.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3781 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_121531__832 | 5 | 0.0 | 53.7717 | 4 | [107, 277] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_121531__832.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3782 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_152649__866 | 0 | 0.0 | 36.7192 | 0 | [107, 207] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_152649__866.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3783 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_152725__524 | 5 | 0.0 | 36.2257 | 4 | [107, 204] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_152725__524.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3784 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_121202__129 | 0 | 0.0 | 30.845 | 0 | [146, 161] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_121202__129.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3785 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_121232__114 | 5 | 0.0 | 30.307 | 4 | [146, 163] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_121232__114.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3786 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_121319__893 | 5 | 0.0 | 47.1066 | 4 | [146, 264] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_121319__893.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3787 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_152537__167 | 5 | 0.0 | 27.7726 | 4 | [146, 147] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_152537__167.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3788 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_152612__960 | 0 | 0.0 | 35.1585 | 0 | [146, 192] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_152612__960.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3789 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_120950__267 | 5 | 0.0 | 65.5192 | 4 | [217, 314] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_120950__267.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3790 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_121042__378 | 5 | 0.0 | 51.8698 | 4 | [217, 267] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_121042__378.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3791 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_121131__236 | 0 | 0.0 | 48.7247 | 0 | [217, 255] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_121131__236.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3792 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_152423__792 | 0 | 0.0 | 58.1812 | 0 | [217, 319] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_152423__792.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3793 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_152509__265 | 5 | 0.0 | 45.7665 | 4 | [217, 245] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_152509__265.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3794 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_121844__791 | 5 | 0.0 | 69.4579 | 4 | [420, 342] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_121844__791.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3795 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_121915__409 | 0 | 0.0 | 30.2009 | 0 | [420, 109] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_121915__409.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3796 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_122057__558 | 0 | 0.0 | 102.366 | 0 | [420, 515] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_122057__558.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3797 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_153059__844 | 0 | 0.0 | 84.8917 | 0 | [420, 434] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_153059__844.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3798 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_153138__904 | 5 | 0.0 | 39.1712 | 4 | [420, 167] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_153138__904.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3799 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_121614__470 | 5 | 0.0 | 42.9976 | 4 | [418, 183] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_121614__470.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3800 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_121656__186 | 0 | 0.0 | 41.7596 | 0 | [418, 176] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_121656__186.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3801 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_121735__361 | 5 | 0.0 | 39.1205 | 4 | [418, 166] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_121735__361.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3802 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_152824__245 | 5 | 0.0 | 59.0104 | 4 | [418, 284] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_152824__245.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3803 | Apple-MacBook-Pro-M1 | ispersonal | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_152934__696 | 0 | 0.0 | 69.439 | 0 | [418, 345] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_152934__696.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3804 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_160008__585 | 5 | 0.0 | 12.002 | 4 | [112, 295] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_160008__585.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3805 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_160018__659 | 5 | 0.0 | 10.5756 | 4 | [112, 259] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_160018__659.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3806 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231227_011110__358 | 0 | 0.0 | 12.7941 | 0 | [112, 314] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231227_011110__358.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3807 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_155946__412 | 0 | 0.0 | 8.13661 | 0 | [153, 191] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_155946__412.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3808 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_155956__722 | 0 | 0.0 | 9.1324 | 0 | [153, 216] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_155956__722.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3809 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_011057__352 | 0 | 0.0 | 9.34527 | 0 | [153, 221] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_011057__352.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3810 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_155928__480 | 5 | 0.0 | 14.0881 | 4 | [227, 166] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_155928__480.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3811 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_155938__553 | 5 | 0.0 | 9.82496 | 4 | [227, 220] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_155938__553.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3812 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_011048__547 | 0 | 0.0 | 20.7439 | 0 | [227, 339] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_011048__547.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3813 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_160138__846 | 0 | 0.0 | 17.3865 | 0 | [420, 378] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_160138__846.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3814 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_160202__206 | 0 | 0.0 | 23.7856 | 4 | [420, 533] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_160202__206.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3815 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_011145__897 | 0 | 0.0 | 18.6938 | 0 | [420, 408] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_011145__897.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3816 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_160104__569 | 0 | 0.0 | 17.539 | 0 | [418, 382] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_160104__569.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3817 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_160121__101 | 0 | 0.0 | 17.1188 | 0 | [418, 372] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_160121__101.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3818 | Apple-MacBook-Pro-M1 | ispersonal | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_011126__399 | 5 | 0.0 | 15.8569 | 4 | [418, 339] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_011126__399.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3819 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 5 | 20231214_073525__823 | 0 | 0.0 | 23.5377 | 0 | [113, 664] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231214_073525__823.json | 25.0 | missing | missing | missing | |
| 3820 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 5 | 20231225_150846__685 | 0 | 0.0 | 7.52348 | 0 | [110, 235] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_150846__685.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3821 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 5 | 20231225_150855__348 | 0 | 0.0 | 8.86333 | 0 | [110, 279] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_150855__348.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3822 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 5 | 20231227_004535__583 | 0 | 0.0 | 10.678 | 0 | [110, 337] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231227_004535__583.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3823 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_073502__999 | 0 | 0.0 | 12.6661 | 0 | [142, 353] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231214_073502__999.json | 25.0 | missing | missing | missing | |
| 3824 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_150831__637 | 0 | 0.0 | 6.08251 | 0 | [151, 180] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_150831__637.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3825 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_150838__803 | 0 | 0.0 | 7.06358 | 0 | [151, 213] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_150838__803.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3826 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_004524__419 | 0 | 0.0 | 7.57785 | 0 | [151, 230] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231227_004524__419.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3827 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_073449__698 | 0 | 0.0 | 17.5631 | 0 | [217, 462] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231214_073449__698.json | 25.0 | missing | missing | missing | |
| 3828 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_150813__251 | 0 | 0.0 | 10.5998 | 0 | [225, 143] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_150813__251.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3829 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_150825__373 | 0 | 0.0 | 11.8628 | 0 | [225, 354] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_150825__373.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3830 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_004517__953 | 0 | 0.0 | 17.157 | 0 | [225, 363] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231227_004517__953.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3831 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_073626__925 | 0 | 0.0 | 25.0838 | 0 | [11, 659] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231214_073626__925.json | 25.0 | missing | missing | missing | |
| 3832 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_150950__754 | 0 | 0.0 | 10.2712 | 0 | [418, 268] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_150950__754.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3833 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_151006__312 | 0 | 0.0 | 16.2016 | 0 | [418, 452] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_151006__312.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3834 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_004602__946 | 5 | 0.0 | 10.9632 | 4 | [418, 288] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231227_004602__946.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3835 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_073600__896 | 0 | 0.0 | 20.274 | 0 | [413, 449] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231214_073600__896.json | 25.0 | missing | missing | missing | |
| 3836 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_150925__459 | 0 | 0.0 | 8.83616 | 0 | [416, 227] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_150925__459.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3837 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_150939__802 | 0 | 0.0 | 14.1221 | 0 | [416, 393] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_150939__802.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3838 | Apple-MacBook-Pro-M1 | ispersonal | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_004551__218 | 0 | 0.0 | 16.0876 | 0 | [416, 451] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231227_004551__218.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3839 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | InJulia | 1SHOT | true | false | 5 | 20231214_074803__733 | 0 | 0.0 | 12.8417 | 0 | [113, 370] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__InJulia__1SHOT__20231214_074803__733.json | 25.0 | missing | missing | missing | |
| 3840 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231225_153029__310 | 0 | 0.0 | 11.9424 | 0 | [116, 209] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__InJulia__1SHOT__20231225_153029__310.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3841 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_153038__804 | 0 | 0.0 | 9.0308 | 0 | [116, 153] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__InJulia__1SHOT__20231225_153038__804.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3842 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231227_005603__263 | 0 | 0.0 | 13.8233 | 0 | [116, 243] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__InJulia__1SHOT__20231227_005603__263.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3843 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_074750__215 | 0 | 0.0 | 9.32874 | 0 | [142, 255] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231214_074750__215.json | 25.0 | missing | missing | missing | |
| 3844 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_153005__150 | 0 | 0.0 | 4.16475 | 0 | [155, 55] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_153005__150.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3845 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_153017__625 | 0 | 0.0 | 11.6633 | 0 | [155, 197] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_153017__625.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3846 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_005550__196 | 0 | 0.0 | 6.20877 | 0 | [155, 94] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231227_005550__196.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3847 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_074740__886 | 0 | 0.0 | 16.2354 | 0 | [217, 426] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231214_074740__886.json | 25.0 | missing | missing | missing | |
| 3848 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_152942__532 | 0 | 0.0 | 27.9905 | 0 | [230, 310] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_152942__532.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3849 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_153001__212 | 0 | 0.0 | 19.601 | 0 | [230, 328] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_153001__212.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3850 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_005543__639 | 0 | 0.0 | 37.8965 | 0 | [230, 493] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231227_005543__639.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3851 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_074913__516 | 0 | 0.0 | 24.3648 | 0 | [11, 643] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231214_074913__516.json | 0.0 | missing | missing | missing | |
| 3852 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_153221__270 | 0 | 0.0 | 26.3958 | 0 | [419, 410] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_153221__270.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3853 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_153247__214 | 0 | 0.0 | 26.239 | 0 | [419, 407] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_153247__214.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3854 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_005650__384 | 0 | 0.0 | 20.4733 | 0 | [419, 305] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231227_005650__384.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3855 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_074849__567 | 0 | 0.0 | 34.082 | 0 | [413, 787] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231214_074849__567.json | 25.0 | missing | missing | missing | |
| 3856 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_153120__953 | 0 | 0.0 | 24.0239 | 0 | [416, 373] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_153120__953.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3857 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_153154__183 | 0 | 0.0 | 33.9153 | 0 | [416, 543] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_153154__183.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3858 | Apple-MacBook-Pro-M1 | ispersonal | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_005629__785 | 0 | 0.0 | 25.8194 | 0 | [416, 402] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231227_005629__785.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3859 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_160249__543 | 0 | 0.0 | 22.505 | 0 | [100, 835] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_160249__543.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3860 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_160310__234 | 0 | 0.0 | 20.8945 | 0 | [100, 779] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_160310__234.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3861 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_011218__740 | 0 | 0.0 | 14.8652 | 0 | [100, 560] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231227_011218__740.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3862 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_160214__205 | 0 | 0.0 | 6.42365 | 0 | [137, 239] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_160214__205.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3863 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_160226__694 | 0 | 0.0 | 11.8793 | 0 | [137, 446] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_160226__694.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3864 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_011203__560 | 0 | 0.0 | 14.3323 | 0 | [137, 533] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_011203__560.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3865 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_160207__583 | 0 | 0.0 | 4.3899 | 0 | [209, 2] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_160207__583.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3866 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_160208__442 | 0 | 0.0 | 0.915709 | 0 | [209, 11] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_160208__442.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3867 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_011149__288 | 0 | 0.0 | 4.34352 | 0 | [209, 10] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_011149__288.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3868 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_160413__229 | 0 | 0.0 | 8.93036 | 0 | [389, 286] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_160413__229.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3869 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_160424__830 | 0 | 0.0 | 10.8322 | 0 | [389, 354] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_160424__830.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3870 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_011235__982 | 0 | 0.0 | 8.25612 | 0 | [389, 259] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_011235__982.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3871 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_160351__453 | 0 | 0.0 | 12.4 | 0 | [386, 410] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_160351__453.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3872 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_160404__618 | 0 | 0.0 | 12.8588 | 0 | [386, 427] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_160404__618.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3873 | Apple-MacBook-Pro-M1 | ispersonal | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_011227__521 | 0 | 0.0 | 9.0866 | 0 | [386, 290] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_011227__521.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3874 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | InJulia | 1SHOT | true | false | 5 | 20231214_075006__883 | 0 | 0.0 | 15.5307 | 0 | [113, 446] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231214_075006__883.json | 25.0 | missing | missing | missing | |
| 3875 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | InJulia | 1SHOT | true | false | 5 | 20231225_153545__855 | 0 | 0.0 | 34.2947 | 0 | [124, 259] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_153545__855.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3876 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_153623__841 | 0 | 0.0 | 38.3017 | 0 | [124, 291] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_153623__841.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3877 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | InJulia | 1SHOT | true | false | 5 | 20231227_005918__119 | 0 | 0.0 | 56.7 | 0 | [124, 432] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231227_005918__119.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3878 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_074950__386 | 0 | 0.0 | 17.9148 | 0 | [142, 500] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231214_074950__386.json | 25.0 | missing | missing | missing | |
| 3879 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_153432__726 | 0 | 0.0 | 28.2964 | 0 | [163, 200] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_153432__726.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3880 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_153511__760 | 0 | 0.0 | 38.1692 | 0 | [163, 279] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_153511__760.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3881 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_005821__527 | 0 | 0.0 | 27.4833 | 0 | [163, 193] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231227_005821__527.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3882 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_074932__833 | 0 | 0.0 | 18.7133 | 0 | [217, 494] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_074932__833.json | 25.0 | missing | missing | missing | |
| 3883 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_153334__129 | 5 | 0.0 | 46.8463 | 4 | [238, 160] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_153334__129.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3884 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_153404__372 | 5 | 0.0 | 30.0718 | 4 | [238, 202] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_153404__372.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3885 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_005754__980 | 0 | 0.0 | 64.1516 | 0 | [238, 307] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_005754__980.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3886 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_075120__174 | 0 | 0.0 | 25.8988 | 0 | [11, 679] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_075120__174.json | 50.0 | missing | missing | missing | |
| 3887 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_153947__108 | 0 | 0.0 | 49.5844 | 0 | [427, 318] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_153947__108.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3888 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_154029__485 | 0 | 0.0 | 41.3716 | 0 | [427, 254] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_154029__485.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3889 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_010125__990 | 0 | 0.0 | 62.4156 | 0 | [427, 412] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_010125__990.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3890 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_075054__801 | 0 | 0.0 | 33.3806 | 0 | [413, 771] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231214_075054__801.json | 25.0 | missing | missing | missing | |
| 3891 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_153816__699 | 0 | 0.0 | 30.6252 | 0 | [424, 172] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_153816__699.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3892 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_153857__759 | 0 | 0.0 | 40.6102 | 0 | [424, 250] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_153857__759.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3893 | Apple-MacBook-Pro-M1 | ispersonal | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_010022__451 | 0 | 0.0 | 63.9743 | 0 | [424, 427] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231227_010022__451.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3894 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231225_155417__822 | 0 | 0.0 | 24.834 | 0 | [112, 417] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_155417__822.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3895 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231225_155438__612 | 0 | 0.0 | 20.7802 | 0 | [112, 347] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_155438__612.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3896 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_010850__768 | 0 | 0.0 | 21.8488 | 0 | [112, 364] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231227_010850__768.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3897 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_155341__560 | 0 | 0.0 | 14.8303 | 0 | [153, 239] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_155341__560.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3898 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_155353__320 | 0 | 0.0 | 11.3318 | 0 | [153, 178] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_155353__320.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3899 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_010829__777 | 0 | 0.0 | 12.5769 | 0 | [153, 199] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_010829__777.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3900 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_155315__248 | 0 | 0.0 | 20.0453 | 0 | [227, 156] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_155315__248.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3901 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_155326__871 | 0 | 0.0 | 11.4974 | 0 | [227, 167] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_155326__871.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3902 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_010816__296 | 0 | 0.0 | 26.6556 | 0 | [227, 278] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_010816__296.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3903 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_155621__437 | 0 | 0.0 | 20.0642 | 0 | [420, 283] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_155621__437.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3904 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_155652__244 | 0 | 0.0 | 30.7366 | 0 | [420, 459] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_155652__244.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3905 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_010932__808 | 0 | 0.0 | 22.918 | 0 | [420, 329] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_010932__808.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3906 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_155536__412 | 0 | 0.0 | 18.7533 | 0 | [418, 261] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_155536__412.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3907 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_155601__641 | 0 | 0.0 | 25.5785 | 0 | [418, 375] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_155601__641.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3908 | Apple-MacBook-Pro-M1 | ispersonal | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_010909__172 | 0 | 0.0 | 18.9263 | 0 | [418, 263] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_010909__172.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3909 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231214_074620__866 | 0 | 0.0 | 14.8885 | 0 | [113, 428] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__InJulia__1SHOT__20231214_074620__866.json | 0.0 | missing | missing | missing | |
| 3910 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231225_152831__179 | 0 | 0.0 | 5.60649 | 0 | [113, 314] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_152831__179.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3911 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | InJulia | 1SHOT | true | false | 5 | 20231225_152838__318 | 0 | 0.0 | 6.95534 | 0 | [113, 388] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_152838__318.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3912 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231227_005447__944 | 0 | 0.0 | 8.17257 | 0 | [113, 450] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__InJulia__1SHOT__20231227_005447__944.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3913 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_074605__350 | 0 | 0.0 | 9.50168 | 0 | [142, 261] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231214_074605__350.json | 25.0 | missing | missing | missing | |
| 3914 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_152818__378 | 0 | 0.0 | 5.96426 | 0 | [150, 325] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_152818__378.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3915 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_152826__428 | 0 | 0.0 | 8.29983 | 0 | [150, 452] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_152826__428.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3916 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_005439__117 | 0 | 0.0 | 9.60422 | 0 | [150, 515] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231227_005439__117.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3917 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_074555__778 | 0 | 0.0 | 15.3458 | 0 | [217, 402] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231214_074555__778.json | 25.0 | missing | missing | missing | |
| 3918 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_152805__340 | 0 | 0.0 | 10.3507 | 0 | [220, 391] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_152805__340.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3919 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_152812__997 | 0 | 0.0 | 6.21582 | 0 | [220, 322] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_152812__997.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3920 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_005429__995 | 0 | 0.0 | 9.05795 | 0 | [220, 331] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231227_005429__995.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3921 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_074724__911 | 0 | 0.0 | 27.6163 | 0 | [11, 721] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231214_074724__911.json | 25.0 | missing | missing | missing | |
| 3922 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_152908__981 | 0 | 0.0 | 6.26613 | 0 | [400, 275] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_152908__981.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3923 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_152914__724 | 0 | 0.0 | 5.59033 | 0 | [400, 240] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_152914__724.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3924 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_005505__132 | 0 | 0.0 | 7.62989 | 0 | [400, 342] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231227_005505__132.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3925 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_074656__161 | 0 | 0.0 | 22.6454 | 0 | [413, 510] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231214_074656__161.json | 25.0 | missing | missing | missing | |
| 3926 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_152855__499 | 0 | 0.0 | 8.69609 | 0 | [398, 399] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_152855__499.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3927 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_152902__635 | 0 | 0.0 | 6.42482 | 0 | [398, 284] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_152902__635.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3928 | Apple-MacBook-Pro-M1 | ispersonal | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_005458__737 | 0 | 0.0 | 10.7113 | 0 | [398, 494] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231227_005458__737.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3929 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | InJulia | 1SHOT | true | false | 5 | 20231214_073653__836 | 0 | 0.0 | 12.5901 | 0 | [113, 363] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__InJulia__1SHOT__20231214_073653__836.json | 25.0 | missing | missing | missing | |
| 3930 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | InJulia | 1SHOT | true | false | 5 | 20231225_151057__332 | 0 | 0.0 | 8.89659 | 0 | [112, 280] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_151057__332.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3931 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_151110__308 | 0 | 0.0 | 13.4202 | 0 | [112, 427] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_151110__308.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3932 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | InJulia | 1SHOT | true | false | 5 | 20231227_004632__406 | 0 | 0.0 | 7.8559 | 0 | [112, 244] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__InJulia__1SHOT__20231227_004632__406.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3933 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_073640__690 | 0 | 0.0 | 6.9744 | 0 | [142, 185] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231214_073640__690.json | 25.0 | missing | missing | missing | |
| 3934 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_151037__824 | 0 | 0.0 | 9.7004 | 0 | [153, 300] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_151037__824.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3935 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_151048__166 | 0 | 0.0 | 11.3251 | 0 | [153, 353] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_151048__166.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3936 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_004624__453 | 0 | 0.0 | 10.7386 | 0 | [153, 332] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231227_004624__453.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3937 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_073633__704 | 0 | 0.0 | 7.43955 | 0 | [217, 176] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231214_073633__704.json | 0.0 | missing | missing | missing | |
| 3938 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_151022__215 | 0 | 0.0 | 15.5728 | 0 | [227, 302] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_151022__215.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3939 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_151027__307 | 0 | 0.0 | 5.23907 | 0 | [227, 137] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_151027__307.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3940 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_004614__739 | 0 | 0.0 | 11.3667 | 0 | [227, 169] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231227_004614__739.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3941 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_073756__157 | 0 | 0.0 | 20.3255 | 0 | [11, 543] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231214_073756__157.json | 25.0 | missing | missing | missing | |
| 3942 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_151207__289 | 0 | 0.0 | 11.6428 | 0 | [420, 311] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_151207__289.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3943 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_151220__379 | 0 | 0.0 | 12.9266 | 0 | [420, 350] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_151220__379.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3944 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_004652__892 | 0 | 0.0 | 10.6362 | 0 | [420, 277] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231227_004652__892.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3945 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_073736__333 | 0 | 0.0 | 26.4857 | 0 | [413, 605] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231214_073736__333.json | 25.0 | missing | missing | missing | |
| 3946 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_151145__988 | 0 | 0.0 | 14.4238 | 0 | [418, 397] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_151145__988.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3947 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_151155__839 | 0 | 0.0 | 10.1827 | 0 | [418, 265] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_151155__839.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3948 | Apple-MacBook-Pro-M1 | ispersonal | starling-lm:latest | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_004642__156 | 0 | 0.0 | 9.52075 | 0 | [418, 242] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231227_004642__156.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3949 | Apple-MacBook-Pro-M1 | ispersonal | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231214_073838__120 | 0 | 0.0 | 14.4259 | 0 | [113, 415] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/yi:34b-chat/evaluation__InJulia__1SHOT__20231214_073838__120.json | 25.0 | missing | missing | missing | |
| 3950 | Apple-MacBook-Pro-M1 | ispersonal | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231225_151446__498 | 5 | 0.0 | 39.8228 | 4 | [107, 293] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_151446__498.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3951 | Apple-MacBook-Pro-M1 | ispersonal | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231225_151535__826 | 0 | 0.0 | 48.8008 | 0 | [107, 362] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_151535__826.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3952 | Apple-MacBook-Pro-M1 | ispersonal | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231227_004855__981 | 0 | 0.0 | 71.9964 | 0 | [107, 537] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/yi:34b-chat/evaluation__InJulia__1SHOT__20231227_004855__981.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3953 | Apple-MacBook-Pro-M1 | ispersonal | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_073824__554 | 0 | 0.0 | 11.6721 | 0 | [142, 324] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231214_073824__554.json | 25.0 | missing | missing | missing | |
| 3954 | Apple-MacBook-Pro-M1 | ispersonal | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_151341__260 | 0 | 0.0 | 22.9643 | 0 | [146, 156] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_151341__260.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3955 | Apple-MacBook-Pro-M1 | ispersonal | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_151406__948 | 0 | 0.0 | 25.1358 | 0 | [146, 173] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_151406__948.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3956 | Apple-MacBook-Pro-M1 | ispersonal | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_004743__598 | 0 | 0.0 | 21.8247 | 0 | [146, 147] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231227_004743__598.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3957 | Apple-MacBook-Pro-M1 | ispersonal | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_073812__226 | 0 | 0.0 | 15.7743 | 0 | [217, 413] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231214_073812__226.json | 0.0 | missing | missing | missing | |
| 3958 | Apple-MacBook-Pro-M1 | ispersonal | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_151300__422 | 0 | 0.0 | 40.0007 | 0 | [217, 76] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_151300__422.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3959 | Apple-MacBook-Pro-M1 | ispersonal | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_151318__664 | 0 | 0.0 | 18.3943 | 0 | [217, 109] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_151318__664.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3960 | Apple-MacBook-Pro-M1 | ispersonal | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_004721__761 | 0 | 0.0 | 28.381 | 0 | [217, 11] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231227_004721__761.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3961 | Apple-MacBook-Pro-M1 | ispersonal | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_073938__360 | 0 | 0.0 | 25.1037 | 0 | [11, 661] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231214_073938__360.json | 0.0 | missing | missing | missing | |
| 3962 | Apple-MacBook-Pro-M1 | ispersonal | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_152004__909 | 5 | 0.0 | 43.6745 | 4 | [420, 262] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_152004__909.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3963 | Apple-MacBook-Pro-M1 | ispersonal | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_152122__171 | 0 | 0.0 | 77.8006 | 0 | [420, 512] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_152122__171.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3964 | Apple-MacBook-Pro-M1 | ispersonal | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_005151__796 | 0 | 0.0 | 89.9002 | 0 | [420, 597] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231227_005151__796.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3965 | Apple-MacBook-Pro-M1 | ispersonal | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_073913__992 | 0 | 0.0 | 24.5522 | 0 | [413, 557] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231214_073913__992.json | 25.0 | missing | missing | missing | |
| 3966 | Apple-MacBook-Pro-M1 | ispersonal | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_151835__515 | 0 | 0.0 | 61.2711 | 0 | [418, 392] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_151835__515.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3967 | Apple-MacBook-Pro-M1 | ispersonal | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_151920__873 | 5 | 0.0 | 45.3965 | 4 | [418, 275] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_151920__873.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3968 | Apple-MacBook-Pro-M1 | ispersonal | yi:34b-chat | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_005021__313 | 0 | 0.0 | 86.3958 | 0 | [418, 573] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/ispersonal/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231227_005021__313.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3969 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-instruct | InJulia | 1SHOT | true | false | 5 | 20231214_075653__860 | 0 | 0.0 | 10.7123 | 0 | [69, 320] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231214_075653__860.json | 25.0 | missing | missing | missing | |
| 3970 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_115301__252 | 3 | 0.0 | 10.0062 | 4 | [77, 178] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_115301__252.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3971 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_115312__718 | 1 | 0.0 | 10.6232 | 4 | [77, 190] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_115312__718.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3972 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231227_012209__407 | 3 | 0.0 | 10.9913 | 4 | [77, 196] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231227_012209__407.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3973 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_075642__355 | 0 | 0.0 | 8.59461 | 0 | [98, 246] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231214_075642__355.json | 25.0 | missing | missing | missing | |
| 3974 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_115243__607 | 5 | 0.0 | 12.3716 | 4 | [115, 218] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_115243__607.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3975 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_115251__421 | 3 | 0.0 | 7.50932 | 4 | [115, 125] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_115251__421.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3976 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_012158__733 | 5 | 0.0 | 6.43846 | 4 | [115, 104] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231227_012158__733.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3977 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_075634__369 | 0 | 0.0 | 11.53 | 0 | [188, 305] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231214_075634__369.json | 0.0 | missing | missing | missing | |
| 3978 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_115224__701 | 0 | 0.0 | 14.6442 | 0 | [206, 59] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_115224__701.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3979 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_115231__246 | 2 | 0.0 | 6.39734 | 4 | [206, 88] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_115231__246.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3980 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_012151__721 | 0 | 0.0 | 12.6192 | 0 | [206, 28] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231227_012151__721.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3981 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_075730__304 | 0 | 0.0 | 17.7795 | 0 | [11, 485] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231214_075730__304.json | 25.0 | missing | missing | missing | |
| 3982 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_115407__616 | 0 | 0.0 | 17.4861 | 0 | [380, 265] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_115407__616.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3983 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_115426__317 | 3 | 0.0 | 18.8938 | 4 | [380, 290] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_115426__317.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3984 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_012234__349 | 3 | 0.0 | 16.7666 | 4 | [380, 250] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231227_012234__349.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3985 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-instruct | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_075712__185 | 0 | 0.0 | 12.8798 | 0 | [369, 273] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231214_075712__185.json | 0.0 | missing | missing | missing | |
| 3986 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_115335__184 | 0 | 0.0 | 16.6928 | 0 | [377, 250] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_115335__184.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3987 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_115350__147 | 5 | 0.0 | 14.2006 | 4 | [377, 206] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_115350__147.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3988 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_012217__829 | 5 | 0.0 | 8.20408 | 4 | [377, 95] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231227_012217__829.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3989 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | InJulia | 1SHOT | true | false | 5 | 20231214_075806__360 | 0 | 0.0 | 9.69465 | 0 | [69, 289] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__InJulia__1SHOT__20231214_075806__360.json | 25.0 | missing | missing | missing | |
| 3990 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_115456__608 | 0 | 0.0 | 4.01778 | 0 | [51, 68] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_115456__608.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3991 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_115500__436 | 0 | 0.0 | 4.48762 | 0 | [51, 77] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_115500__436.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3992 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_075756__804 | 1 | 0.0 | 6.53698 | 0 | [98, 184] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231214_075756__804.json | 55.0 | missing | missing | missing | |
| 3993 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_115445__558 | 0 | 0.0 | 2.38713 | 0 | [52, 36] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_115445__558.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3994 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_115452__743 | 0 | 0.0 | 6.92008 | 0 | [52, 124] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_115452__743.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3995 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_075750__401 | 0 | 0.0 | 19.5067 | 0 | [188, 527] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231214_075750__401.json | 0.0 | missing | missing | missing | |
| 3996 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_115439__149 | 0 | 0.0 | 12.7531 | 0 | [81, 37] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_115439__149.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3997 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_115442__757 | 0 | 0.0 | 3.12204 | 0 | [81, 45] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_115442__757.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 3998 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_075851__102 | 0 | 0.0 | 19.3053 | 0 | [11, 525] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231214_075851__102.json | 50.0 | missing | missing | missing | |
| 3999 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_115526__376 | 0 | 0.0 | 1.36246 | 0 | [69, 11] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_115526__376.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4000 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_115532__926 | 0 | 0.0 | 6.56186 | 0 | [69, 112] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_115532__926.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4001 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_075832__708 | 0 | 0.0 | 16.6763 | 0 | [369, 375] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231214_075832__708.json | 0.0 | missing | missing | missing | |
| 4002 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_115517__968 | 0 | 0.0 | 5.3688 | 0 | [66, 89] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_115517__968.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4003 | Apple-MacBook-Pro-M1 | keep_only_names | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_115525__395 | 0 | 0.0 | 7.39124 | 0 | [66, 128] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_115525__395.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4004 | Apple-MacBook-Pro-M1 | keep_only_names | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_121031__759 | 5 | 0.0 | 22.4166 | 4 | [70, 129] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_121031__759.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4005 | Apple-MacBook-Pro-M1 | keep_only_names | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_121117__511 | 1 | 0.0 | 45.1029 | 0 | [70, 272] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_121117__511.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4006 | Apple-MacBook-Pro-M1 | keep_only_names | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_013007__626 | 5 | 0.0 | 32.9127 | 4 | [70, 195] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_013007__626.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4007 | Apple-MacBook-Pro-M1 | keep_only_names | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_120946__321 | 0 | 0.0 | 32.2184 | 0 | [111, 185] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_120946__321.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4008 | Apple-MacBook-Pro-M1 | keep_only_names | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_121009__470 | 0 | 0.0 | 22.6671 | 0 | [111, 125] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_121009__470.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4009 | Apple-MacBook-Pro-M1 | keep_only_names | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_012934__902 | 0 | 0.0 | 8.15728 | 0 | [111, 33] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_012934__902.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4010 | Apple-MacBook-Pro-M1 | keep_only_names | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_120844__611 | 0 | 0.0 | 52.634 | 0 | [201, 136] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_120844__611.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4011 | Apple-MacBook-Pro-M1 | keep_only_names | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_120914__210 | 0 | 0.0 | 30.0926 | 0 | [201, 156] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_120914__210.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4012 | Apple-MacBook-Pro-M1 | keep_only_names | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_012926__418 | 0 | 0.0 | 52.6361 | 0 | [201, 151] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_012926__418.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4013 | Apple-MacBook-Pro-M1 | keep_only_names | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_121406__310 | 4 | 0.0 | 35.7423 | 4 | [399, 159] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_121406__310.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4014 | Apple-MacBook-Pro-M1 | keep_only_names | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_121446__568 | 0 | 0.0 | 40.2936 | 0 | [399, 187] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_121446__568.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4015 | Apple-MacBook-Pro-M1 | keep_only_names | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_013125__337 | 3 | 0.0 | 46.6654 | 4 | [399, 225] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_013125__337.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4016 | Apple-MacBook-Pro-M1 | keep_only_names | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_121257__530 | 5 | 0.0 | 49.4935 | 4 | [397, 243] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_121257__530.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4017 | Apple-MacBook-Pro-M1 | keep_only_names | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_121330__115 | 3 | 0.0 | 32.3043 | 4 | [397, 138] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_121330__115.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4018 | Apple-MacBook-Pro-M1 | keep_only_names | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_013038__836 | 3 | 0.0 | 30.2153 | 4 | [397, 125] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_013038__836.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4019 | Apple-MacBook-Pro-M1 | keep_only_names | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_013649__443 | 0 | 0.0 | 5.20229 | 0 | [71, 199] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_013649__443.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4020 | Apple-MacBook-Pro-M1 | keep_only_names | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_122137__592 | 0 | 0.0 | 8.17088 | 0 | [71, 315] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_122137__592.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4021 | Apple-MacBook-Pro-M1 | keep_only_names | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_122143__936 | 0 | 0.0 | 5.28845 | 0 | [71, 203] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_122143__936.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4022 | Apple-MacBook-Pro-M1 | keep_only_names | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_122150__591 | 0 | 0.0 | 7.48893 | 0 | [71, 287] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_122150__591.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4023 | Apple-MacBook-Pro-M1 | keep_only_names | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_013644__224 | 0 | 0.0 | 6.33872 | 0 | [108, 239] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_013644__224.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4024 | Apple-MacBook-Pro-M1 | keep_only_names | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_122118__686 | 0 | 0.0 | 5.14139 | 0 | [108, 191] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_122118__686.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4025 | Apple-MacBook-Pro-M1 | keep_only_names | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_122124__174 | 3 | 0.0 | 5.51818 | 4 | [108, 203] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_122124__174.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4026 | Apple-MacBook-Pro-M1 | keep_only_names | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_122129__552 | 0 | 0.0 | 5.12059 | 0 | [108, 191] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_122129__552.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4027 | Apple-MacBook-Pro-M1 | keep_only_names | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_013637__206 | 0 | 0.0 | 5.12179 | 0 | [196, 44] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_013637__206.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4028 | Apple-MacBook-Pro-M1 | keep_only_names | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_122106__669 | 0 | 0.0 | 8.33083 | 0 | [196, 165] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_122106__669.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4029 | Apple-MacBook-Pro-M1 | keep_only_names | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_122107__715 | 0 | 0.0 | 1.78343 | 0 | [196, 46] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_122107__715.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4030 | Apple-MacBook-Pro-M1 | keep_only_names | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_122113__909 | 0 | 0.0 | 4.97684 | 0 | [196, 171] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_122113__909.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4031 | Apple-MacBook-Pro-M1 | keep_only_names | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_013658__454 | 0 | 0.0 | 4.95156 | 0 | [360, 143] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_013658__454.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4032 | Apple-MacBook-Pro-M1 | keep_only_names | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_122231__103 | 0 | 0.0 | 10.6299 | 4 | [360, 351] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_122231__103.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4033 | Apple-MacBook-Pro-M1 | keep_only_names | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_122241__648 | 0 | 0.0 | 9.9592 | 0 | [360, 326] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_122241__648.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4034 | Apple-MacBook-Pro-M1 | keep_only_names | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_122248__788 | 0 | 0.0 | 7.34004 | 0 | [360, 230] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_122248__788.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4035 | Apple-MacBook-Pro-M1 | keep_only_names | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_013653__699 | 0 | 0.0 | 4.00686 | 0 | [357, 108] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_013653__699.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4036 | Apple-MacBook-Pro-M1 | keep_only_names | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_122203__144 | 1 | 0.0 | 13.01 | 0 | [357, 434] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_122203__144.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4037 | Apple-MacBook-Pro-M1 | keep_only_names | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_122214__348 | 0 | 0.0 | 10.071 | 0 | [357, 331] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_122214__348.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4038 | Apple-MacBook-Pro-M1 | keep_only_names | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_122220__427 | 0 | 0.0 | 6.53903 | 0 | [357, 202] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_122220__427.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4039 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | InJulia | 1SHOT | true | false | 5 | 20231214_075147__232 | 0 | 0.0 | 9.14014 | 0 | [69, 271] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__InJulia__1SHOT__20231214_075147__232.json | 25.0 | missing | missing | missing | |
| 4040 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | InJulia | 1SHOT | true | false | 5 | 20231225_113939__287 | 0 | 0.0 | 11.0209 | 0 | [69, 331] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__InJulia__1SHOT__20231225_113939__287.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4041 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | InJulia | 1SHOT | true | true | 5 | 20231225_113950__879 | 3 | 0.0 | 10.8064 | 4 | [1, 341] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__InJulia__1SHOT__20231225_113950__879.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4042 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | InJulia | 1SHOT | true | true | 5 | 20231227_011538__125 | 0 | 0.0 | 7.97343 | 4 | [69, 241] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__InJulia__1SHOT__20231227_011538__125.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4043 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_075138__953 | 1 | 0.0 | 6.77455 | 0 | [98, 190] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaExpertAsk__1SHOT__20231214_075138__953.json | 55.0 | missing | missing | missing | |
| 4044 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_113920__716 | 0 | 0.0 | 8.37718 | 0 | [98, 241] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_113920__716.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4045 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_113928__561 | 0 | 0.0 | 7.45368 | 0 | [1, 235] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_113928__561.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4046 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_011530__233 | 2 | 0.0 | 7.76225 | 4 | [98, 225] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaExpertAsk__1SHOT__20231227_011530__233.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4047 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_075131__327 | 1 | 0.0 | 11.0089 | 0 | [188, 288] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_075131__327.json | 55.0 | missing | missing | missing | |
| 4048 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_113904__172 | 1 | 0.0 | 20.2761 | 0 | [206, 398] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_113904__172.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4049 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_113912__726 | 1 | 0.0 | 7.32642 | 0 | [1, 224] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_113912__726.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4050 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_011522__320 | 1 | 0.0 | 19.4021 | 0 | [206, 401] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_011522__320.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4051 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_075249__867 | 0 | 0.0 | 30.8058 | 0 | [11, 806] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_075249__867.json | 50.0 | missing | missing | missing | |
| 4052 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_114107__648 | 0 | 0.0 | 16.4183 | 0 | [11, 452] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_114107__648.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4053 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_114140__671 | 0 | 0.0 | 32.8506 | 0 | [1, 862] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_114140__671.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4054 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_011617__163 | 0 | 0.0 | 18.3149 | 0 | [11, 506] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_011617__163.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4055 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_075219__337 | 0 | 0.0 | 21.922 | 0 | [369, 512] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaRecapTask__1SHOT__20231214_075219__337.json | 25.0 | missing | missing | missing | |
| 4056 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_114029__281 | 0 | 0.0 | 19.2851 | 0 | [369, 447] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_114029__281.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4057 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_114050__407 | 2 | 0.0 | 20.8201 | 4 | [1, 571] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_114050__407.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4058 | Apple-MacBook-Pro-M1 | keep_only_names | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_011558__697 | 1 | 0.0 | 20.139 | 4 | [369, 473] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/llama2/evaluation__JuliaRecapTask__1SHOT__20231227_011558__697.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4059 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | InJulia | 1SHOT | true | false | 5 | 20231214_075925__727 | 0 | 0.0 | 10.7911 | 0 | [69, 322] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__InJulia__1SHOT__20231214_075925__727.json | 25.0 | missing | missing | missing | |
| 4060 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_115611__129 | 5 | 0.0 | 3.87066 | 4 | [69, 120] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__InJulia__1SHOT__20231225_115611__129.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4061 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_115617__699 | 5 | 0.0 | 5.7648 | 4 | [69, 186] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__InJulia__1SHOT__20231225_115617__699.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4062 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | InJulia | 1SHOT | true | true | 5 | 20231227_012301__545 | 5 | 0.0 | 7.67375 | 4 | [69, 249] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__InJulia__1SHOT__20231227_012301__545.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4063 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_075914__609 | 1 | 0.0 | 8.57943 | 0 | [98, 246] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231214_075914__609.json | 55.0 | missing | missing | missing | |
| 4064 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_115559__493 | 1 | 0.0 | 7.70762 | 0 | [108, 247] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_115559__493.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4065 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_115607__814 | 5 | 0.0 | 7.12072 | 4 | [108, 225] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_115607__814.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4066 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_012253__535 | 5 | 0.0 | 6.41564 | 4 | [108, 201] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231227_012253__535.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4067 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_075906__373 | 0 | 0.0 | 14.1637 | 0 | [188, 380] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231214_075906__373.json | 0.0 | missing | missing | missing | |
| 4068 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_115545__755 | 1 | 0.0 | 12.9754 | 4 | [198, 207] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_115545__755.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4069 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_115552__214 | 5 | 0.0 | 6.03938 | 4 | [198, 174] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_115552__214.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4070 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_012247__408 | 5 | 0.0 | 12.5072 | 4 | [198, 196] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231227_012247__408.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4071 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_080019__909 | 1 | 0.0 | 17.8886 | 4 | [11, 488] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231214_080019__909.json | 80.0 | missing | missing | missing | |
| 4072 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_115644__645 | 5 | 0.0 | 6.42166 | 4 | [372, 159] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_115644__645.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4073 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_115657__683 | 1 | 0.0 | 13.0512 | 0 | [372, 372] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_115657__683.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4074 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_012318__322 | 0 | 0.0 | 8.25777 | 0 | [372, 218] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231227_012318__322.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4075 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_080001__644 | 0 | 0.0 | 27.6734 | 0 | [369, 656] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaRecapTask__1SHOT__20231214_080001__644.json | 50.0 | missing | missing | missing | |
| 4076 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_115632__599 | 5 | 0.0 | 6.90201 | 4 | [369, 175] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_115632__599.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4077 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_115638__214 | 3 | 0.0 | 5.04292 | 4 | [369, 114] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_115638__214.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4078 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_012310__650 | 5 | 0.0 | 8.13264 | 4 | [369, 214] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder/evaluation__JuliaRecapTask__1SHOT__20231227_012310__650.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4079 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_181743__639 | 5 | 0.0 | 11.331 | 4 | [69, 207] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_181743__639.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4080 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_181755__859 | 5 | 0.0 | 11.6128 | 4 | [69, 208] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_181755__859.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4081 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_181805__878 | 5 | 0.0 | 10.0143 | 4 | [69, 180] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_181805__878.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4082 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_181716__581 | 1 | 0.0 | 12.4516 | 4 | [108, 222] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_181716__581.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4083 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_181725__970 | 5 | 0.0 | 8.43832 | 4 | [108, 145] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_181725__970.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4084 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_181732__789 | 1 | 0.0 | 6.9484 | 0 | [108, 119] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_181732__789.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4085 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_181641__913 | 5 | 0.0 | 11.7527 | 4 | [198, 192] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_181641__913.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4086 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_181651__526 | 5 | 0.0 | 10.184 | 4 | [198, 170] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_181651__526.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4087 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_181703__902 | 5 | 0.0 | 12.1288 | 4 | [198, 202] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_181703__902.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4088 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_181856__900 | 5 | 0.0 | 14.5329 | 4 | [372, 238] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_181856__900.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4089 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_181905__266 | 1 | 0.0 | 8.94161 | 0 | [372, 133] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_181905__266.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4090 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_181914__951 | 1 | 0.0 | 8.04417 | 0 | [372, 116] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_181914__951.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4091 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_181818__688 | 1 | 0.0 | 12.4553 | 0 | [369, 203] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_181818__688.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4092 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_181832__401 | 0 | 0.0 | 14.2693 | 0 | [369, 228] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_181832__401.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4093 | Apple-MacBook-Pro-M1 | keep_only_names | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_181842__770 | 5 | 0.0 | 9.19414 | 4 | [369, 133] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_181842__770.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4094 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_121844__836 | 0 | 0.0 | 5.88966 | 0 | [67, 143] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_121844__836.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4095 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_121851__254 | 3 | 0.0 | 6.24507 | 4 | [67, 152] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_121851__254.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4096 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_013322__878 | 3 | 0.0 | 5.85527 | 4 | [67, 141] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_013322__878.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4097 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_121836__459 | 1 | 0.0 | 2.03387 | 0 | [108, 36] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_121836__459.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4098 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_121838__244 | 0 | 0.0 | 2.00773 | 0 | [108, 35] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_121838__244.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4099 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_013316__556 | 3 | 0.0 | 2.38388 | 4 | [108, 45] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_013316__556.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4100 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_121827__746 | 1 | 0.0 | 15.7963 | 0 | [198, 236] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_121827__746.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4101 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_121834__226 | 0 | 0.0 | 7.05053 | 0 | [198, 154] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_121834__226.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4102 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_013313__283 | 3 | 0.0 | 17.9174 | 4 | [198, 294] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_013313__283.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4103 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_121940__506 | 1 | 0.0 | 16.0412 | 0 | [375, 356] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_121940__506.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4104 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_121954__681 | 0 | 0.0 | 14.0419 | 0 | [375, 306] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_121954__681.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4105 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_013346__976 | 0 | 0.0 | 13.3496 | 0 | [375, 287] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_013346__976.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4106 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_121916__980 | 1 | 0.0 | 10.7637 | 0 | [373, 224] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_121916__980.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4107 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_121923__605 | 1 | 0.0 | 7.27939 | 0 | [373, 136] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_121923__605.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4108 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_013333__653 | 0 | 0.0 | 10.421 | 0 | [373, 214] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_013333__653.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4109 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231228_000117__749 | 0 | 0.0 | 6.72073 | 4 | [66, 209] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_000117__749.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4110 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231228_000125__870 | 0 | 0.0 | 7.91929 | 0 | [66, 248] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_000125__870.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4111 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | false | false | 5 | 20231228_000138__578 | 0 | 0.0 | 12.6277 | 0 | [66, 401] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_000138__578.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4112 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | false | false | 5 | 20231228_000147__265 | 0 | 0.0 | 8.94519 | 0 | [66, 282] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_000147__265.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4113 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | false | 5 | 20231228_000156__782 | 0 | 0.0 | 9.09851 | 0 | [66, 287] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_000156__782.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4114 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_000100__936 | 1 | 0.0 | 1.83867 | 0 | [107, 41] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_000100__936.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4115 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_000105__216 | 1 | 0.0 | 4.1842 | 0 | [107, 120] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_000105__216.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4116 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231228_000107__494 | 0 | 0.0 | 1.7923 | 0 | [107, 40] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_000107__494.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4117 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231228_000108__361 | 0 | 0.0 | 1.8199 | 0 | [107, 41] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_000108__361.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4118 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_000110__106 | 0 | 0.0 | 1.98161 | 0 | [107, 46] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_000110__106.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4119 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231228_000018__111 | 0 | 0.0 | 12.7239 | 0 | [197, 355] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_000018__111.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4120 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231228_000029__689 | 0 | 0.0 | 10.6899 | 0 | [197, 315] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_000029__689.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4121 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_000040__695 | 0 | 0.0 | 11.1002 | 4 | [197, 324] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_000040__695.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4122 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_000050__246 | 0 | 0.0 | 10.016 | 0 | [197, 293] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_000050__246.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4123 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231228_000058__415 | 0 | 0.0 | 8.46703 | 0 | [197, 243] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_000058__415.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4124 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_000305__613 | 2 | 0.0 | 8.05353 | 4 | [374, 202] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_000305__613.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4125 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_000317__985 | 0 | 0.0 | 11.4826 | 0 | [374, 309] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_000317__985.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4126 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_000327__417 | 1 | 0.0 | 10.4418 | 0 | [374, 277] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_000327__417.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4127 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231228_000337__849 | 0 | 0.0 | 9.6749 | 0 | [374, 253] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_000337__849.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4128 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_000351__910 | 0 | 0.0 | 14.1706 | 0 | [374, 392] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_000351__910.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4129 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_000206__875 | 0 | 0.0 | 9.84195 | 0 | [372, 258] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_000206__875.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4130 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_000221__870 | 4 | 0.0 | 15.3997 | 4 | [372, 429] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_000221__870.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4131 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_000236__783 | 0 | 0.0 | 14.0683 | 0 | [372, 389] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_000236__783.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4132 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231228_000246__335 | 0 | 0.0 | 10.6677 | 0 | [372, 284] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_000246__335.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4133 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_000257__560 | 0 | 0.0 | 10.5736 | 0 | [372, 280] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_000257__560.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4134 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_000536__999 | 0 | 0.0 | 8.1674 | 0 | [66, 200] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_000536__999.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4135 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231228_000544__683 | 0 | 0.0 | 7.37635 | 0 | [66, 180] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_000544__683.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4136 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_000553__108 | 0 | 0.0 | 9.61419 | 0 | [66, 238] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_000553__108.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4137 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_000603__759 | 0 | 0.0 | 9.35721 | 0 | [66, 231] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_000603__759.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4138 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_000611__703 | 0 | 0.0 | 8.12283 | 0 | [66, 199] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_000611__703.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4139 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_000514__108 | 0 | 0.0 | 3.09153 | 0 | [107, 63] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_000514__108.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4140 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_000519__441 | 1 | 0.0 | 4.95916 | 0 | [107, 112] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_000519__441.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4141 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_000522__628 | 0 | 0.0 | 2.98089 | 0 | [107, 60] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_000522__628.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4142 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_000525__174 | 0 | 0.0 | 2.93668 | 0 | [107, 59] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_000525__174.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4143 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_000528__325 | 0 | 0.0 | 3.01248 | 0 | [107, 61] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_000528__325.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4144 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_000404__562 | 0 | 0.0 | 12.8939 | 0 | [197, 280] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_000404__562.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4145 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_000422__471 | 1 | 0.0 | 18.1339 | 0 | [197, 432] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_000422__471.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4146 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231228_000439__627 | 0 | 0.0 | 16.4079 | 0 | [197, 389] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_000439__627.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4147 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231228_000452__109 | 0 | 0.0 | 12.8866 | 0 | [197, 301] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_000452__109.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4148 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_000510__185 | 3 | 0.0 | 18.6899 | 4 | [197, 446] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_000510__185.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4149 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_000729__368 | 0 | 0.0 | 11.2689 | 0 | [374, 234] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_000729__368.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4150 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_000759__730 | 1 | 0.0 | 29.9657 | 0 | [374, 687] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_000759__730.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4151 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_000815__481 | 0 | 0.0 | 15.8855 | 0 | [374, 348] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_000815__481.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4152 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231228_000827__779 | 0 | 0.0 | 11.8608 | 0 | [374, 249] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_000827__779.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4153 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_000841__816 | 0 | 0.0 | 13.7785 | 0 | [374, 296] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_000841__816.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4154 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_000625__822 | 0 | 0.0 | 13.5743 | 0 | [372, 291] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_000625__822.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4155 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_000640__288 | 0 | 0.0 | 15.1864 | 0 | [372, 331] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_000640__288.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4156 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_000654__863 | 0 | 0.0 | 14.2143 | 0 | [372, 307] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_000654__863.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4157 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_000704__981 | 0 | 0.0 | 9.8181 | 0 | [372, 198] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_000704__981.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4158 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_000718__526 | 1 | 0.0 | 13.5964 | 0 | [372, 292] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_000718__526.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4159 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_122745__870 | 0 | 0.0 | 13.6263 | 0 | [66, 249] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_122745__870.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4160 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231226_122758__363 | 0 | 0.0 | 13.1537 | 0 | [66, 240] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_122758__363.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4161 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_013550__301 | 0 | 0.0 | 13.0602 | 0 | [66, 238] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231227_013550__301.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4162 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_122728__714 | 0 | 0.0 | 3.1303 | 0 | [107, 46] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_122728__714.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4163 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_122731__658 | 0 | 0.0 | 3.01569 | 0 | [107, 44] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_122731__658.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4164 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_013537__276 | 0 | 0.0 | 7.86999 | 0 | [107, 136] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_013537__276.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4165 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_122701__732 | 1 | 0.0 | 11.6019 | 4 | [197, 196] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_122701__732.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4166 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_122724__194 | 1 | 0.0 | 23.5279 | 4 | [197, 417] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_122724__194.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4167 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_013529__736 | 3 | 0.0 | 31.2166 | 4 | [197, 397] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_013529__736.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4168 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_122922__426 | 0 | 0.0 | 19.9279 | 0 | [374, 330] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_122922__426.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4169 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_122941__992 | 0 | 0.0 | 18.6726 | 0 | [374, 307] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_122941__992.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4170 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_013632__726 | 1 | 0.0 | 19.7644 | 4 | [374, 326] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_013632__726.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4171 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_122845__715 | 0 | 0.0 | 18.8985 | 0 | [372, 311] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_122845__715.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4172 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_122902__306 | 0 | 0.0 | 16.9156 | 0 | [372, 275] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_122902__306.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4173 | Apple-MacBook-Pro-M1 | keep_only_names | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_013612__584 | 0 | 0.0 | 22.1835 | 0 | [372, 370] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_013612__584.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4174 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_122745__857 | 1 | 0.0 | 58.8855 | 0 | [71, 344] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_122745__857.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4175 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_122826__427 | 5 | 0.0 | 40.5483 | 4 | [71, 237] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_122826__427.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4176 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_122849__318 | 0 | 0.0 | 22.5282 | 0 | [71, 126] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_122849__318.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4177 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_153501__743 | 5 | 0.0 | 50.2937 | 4 | [71, 295] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_153501__743.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4178 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_153539__359 | 1 | 0.0 | 37.5114 | 0 | [71, 205] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_153539__359.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4179 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_122614__446 | 0 | 0.0 | 12.1889 | 0 | [110, 53] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_122614__446.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4180 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_122630__103 | 0 | 0.0 | 16.778 | 0 | [110, 80] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_122630__103.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4181 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_122646__380 | 0 | 0.0 | 16.1973 | 0 | [110, 74] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_122646__380.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4182 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_153343__111 | 0 | 0.0 | 9.55022 | 0 | [110, 40] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_153343__111.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4183 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_153410__845 | 5 | 0.0 | 27.217 | 4 | [110, 149] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_153410__845.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4184 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_122353__268 | 0 | 0.0 | 64.7492 | 0 | [200, 305] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_122353__268.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4185 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_122441__809 | 0 | 0.0 | 48.1443 | 0 | [200, 244] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_122441__809.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4186 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_122601__397 | 5 | 0.0 | 79.7252 | 4 | [200, 419] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_122601__397.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4187 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_153234__750 | 1 | 0.0 | 55.6298 | 0 | [200, 305] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_153234__750.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4188 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_153333__893 | 0 | 0.0 | 59.5287 | 0 | [200, 328] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_153333__893.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4189 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_123146__804 | 0 | 0.0 | 34.6543 | 4 | [384, 151] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_123146__804.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4190 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_123156__109 | 0 | 0.0 | 10.2473 | 0 | [384, 4] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_123156__109.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4191 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_123326__218 | 5 | 0.0 | 90.109 | 4 | [384, 477] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_123326__218.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4192 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_153751__297 | 0 | 0.0 | 10.433 | 0 | [384, 4] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_153751__297.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4193 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_153824__971 | 0 | 0.0 | 33.2001 | 0 | [384, 141] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_153824__971.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4194 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_122951__976 | 3 | 0.0 | 61.7902 | 4 | [382, 312] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_122951__976.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4195 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_123037__575 | 1 | 0.0 | 45.7293 | 0 | [382, 217] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_123037__575.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4196 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_123111__662 | 5 | 0.0 | 34.2941 | 4 | [382, 149] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_123111__662.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4197 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_153612__393 | 1 | 0.0 | 32.7857 | 0 | [382, 138] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_153612__393.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4198 | Apple-MacBook-Pro-M1 | keep_only_names | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_153740__682 | 1 | 0.0 | 88.0905 | 0 | [382, 457] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_153740__682.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4199 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_122038__541 | 3 | 0.0 | 6.53778 | 4 | [75, 159] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_122038__541.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4200 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_122045__593 | 0 | 0.0 | 7.10382 | 4 | [75, 174] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_122045__593.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4201 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_013414__238 | 0 | 0.0 | 6.64198 | 0 | [75, 161] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231227_013414__238.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4202 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_122029__210 | 0 | 0.0 | 4.77329 | 0 | [116, 108] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_122029__210.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4203 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_122031__694 | 1 | 0.0 | 2.35263 | 0 | [116, 44] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_122031__694.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4204 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_013407__599 | 3 | 0.0 | 3.68504 | 4 | [116, 79] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_013407__599.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4205 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_122015__574 | 0 | 0.0 | 20.8095 | 0 | [206, 343] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_122015__574.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4206 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_122024__654 | 1 | 0.0 | 9.13778 | 0 | [206, 207] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_122024__654.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4207 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_013403__535 | 0 | 0.0 | 17.3026 | 0 | [206, 263] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_013403__535.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4208 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_122134__140 | 1 | 0.0 | 11.6293 | 0 | [383, 245] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_122134__140.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4209 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_122147__304 | 1 | 0.0 | 11.9915 | 0 | [383, 253] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_122147__304.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4210 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_013443__369 | 1 | 0.0 | 14.1849 | 0 | [383, 307] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_013443__369.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4211 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_122111__956 | 5 | 0.0 | 13.0121 | 4 | [381, 279] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_122111__956.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4212 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_122123__292 | 0 | 0.0 | 11.8215 | 0 | [381, 250] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_122123__292.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4213 | Apple-MacBook-Pro-M1 | keep_only_names | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_013429__398 | 1 | 0.0 | 14.7892 | 0 | [381, 322] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_013429__398.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4214 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 5 | 20231214_075323__883 | 0 | 0.0 | 12.0569 | 0 | [69, 359] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231214_075323__883.json | 25.0 | missing | missing | missing | |
| 4215 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_114216__269 | 5 | 0.0 | 5.59987 | 4 | [73, 176] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_114216__269.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4216 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_114227__706 | 1 | 0.0 | 11.4762 | 0 | [73, 372] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_114227__706.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4217 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231227_011645__523 | 5 | 0.0 | 8.73798 | 4 | [73, 279] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231227_011645__523.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4218 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_075311__717 | 1 | 0.0 | 8.9848 | 0 | [98, 258] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231214_075311__717.json | 55.0 | missing | missing | missing | |
| 4219 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_114205__239 | 0 | 0.0 | 2.5545 | 0 | [114, 67] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_114205__239.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4220 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_114210__657 | 5 | 0.0 | 5.19515 | 4 | [114, 157] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_114210__657.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4221 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_011636__194 | 5 | 0.0 | 4.22062 | 4 | [114, 123] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231227_011636__194.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4222 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_075302__125 | 0 | 0.0 | 12.4073 | 0 | [188, 331] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231214_075302__125.json | 25.0 | missing | missing | missing | |
| 4223 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_114155__854 | 4 | 0.0 | 15.0408 | 4 | [204, 288] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_114155__854.json | 95.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4224 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_114202__917 | 3 | 0.0 | 7.19631 | 4 | [204, 208] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_114202__917.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4225 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_011631__176 | 5 | 0.0 | 14.6013 | 4 | [204, 286] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231227_011631__176.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4226 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_075409__680 | 0 | 0.0 | 14.6656 | 0 | [11, 405] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231214_075409__680.json | 0.0 | missing | missing | missing | |
| 4227 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_114303__523 | 0 | 0.0 | 2.69387 | 0 | [381, 33] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_114303__523.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4228 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_114311__595 | 5 | 0.0 | 8.52521 | 4 | [381, 222] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_114311__595.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4229 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_011656__496 | 0 | 0.0 | 2.55718 | 0 | [381, 28] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231227_011656__496.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4230 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_075354__652 | 0 | 0.0 | 21.1672 | 0 | [369, 493] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231214_075354__652.json | 50.0 | missing | missing | missing | |
| 4231 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_114246__271 | 5 | 0.0 | 7.84882 | 4 | [379, 201] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_114246__271.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4232 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_114300__744 | 5 | 0.0 | 13.5753 | 4 | [379, 383] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_114300__744.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4233 | Apple-MacBook-Pro-M1 | keep_only_names | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_011653__518 | 5 | 0.0 | 8.46281 | 4 | [379, 219] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231227_011653__518.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4234 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | InJulia | 1SHOT | true | false | 5 | 20231214_080220__844 | 0 | 0.0 | 10.6957 | 0 | [69, 319] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__InJulia__1SHOT__20231214_080220__844.json | 25.0 | missing | missing | missing | |
| 4235 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231225_115903__346 | 3 | 0.0 | 10.3201 | 4 | [72, 184] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__InJulia__1SHOT__20231225_115903__346.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4236 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231225_115913__589 | 3 | 0.0 | 9.80038 | 4 | [72, 174] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__InJulia__1SHOT__20231225_115913__589.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4237 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231227_012452__176 | 0 | 0.0 | 10.4962 | 0 | [72, 186] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__InJulia__1SHOT__20231227_012452__176.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4238 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_080209__449 | 0 | 0.0 | 6.83198 | 0 | [98, 193] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231214_080209__449.json | 25.0 | missing | missing | missing | |
| 4239 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_115843__536 | 0 | 0.0 | 4.54659 | 0 | [111, 68] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_115843__536.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4240 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_115853__191 | 0 | 0.0 | 9.97366 | 0 | [111, 172] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_115853__191.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4241 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_012442__440 | 3 | 0.0 | 9.19185 | 4 | [111, 156] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231227_012442__440.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4242 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_080202__315 | 5 | 0.0 | 12.7731 | 4 | [188, 341] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231214_080202__315.json | 100.0 | missing | missing | missing | |
| 4243 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_115820__136 | 3 | 0.0 | 29.4197 | 4 | [201, 341] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_115820__136.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4244 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_115838__456 | 0 | 0.0 | 17.9023 | 0 | [201, 303] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_115838__456.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4245 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_012432__958 | 0 | 0.0 | 41.7004 | 0 | [201, 566] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231227_012432__958.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4246 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_080315__272 | 0 | 0.0 | 20.2466 | 0 | [11, 549] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231214_080315__272.json | 25.0 | missing | missing | missing | |
| 4247 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_120050__959 | 0 | 0.0 | 39.4556 | 0 | [375, 645] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_120050__959.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4248 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_120114__453 | 0 | 0.0 | 23.4365 | 0 | [375, 370] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_120114__453.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4249 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_012521__951 | 0 | 0.0 | 12.7635 | 0 | [375, 178] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231227_012521__951.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4250 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_080255__610 | 0 | 0.0 | 24.2477 | 0 | [369, 572] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231214_080255__610.json | 50.0 | missing | missing | missing | |
| 4251 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_115950__570 | 0 | 0.0 | 12.869 | 0 | [372, 181] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_115950__570.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4252 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_120010__858 | 0 | 0.0 | 19.9741 | 0 | [372, 309] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_120010__858.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4253 | Apple-MacBook-Pro-M1 | keep_only_names | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_012508__515 | 1 | 0.0 | 15.6459 | 0 | [372, 230] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231227_012508__515.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4254 | Apple-MacBook-Pro-M1 | keep_only_names | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_122243__492 | 0 | 0.0 | 15.4193 | 0 | [65, 592] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_122243__492.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4255 | Apple-MacBook-Pro-M1 | keep_only_names | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_122244__859 | 0 | 0.0 | 1.40873 | 0 | [65, 48] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_122244__859.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4256 | Apple-MacBook-Pro-M1 | keep_only_names | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_013451__664 | 0 | 0.0 | 2.35911 | 0 | [65, 86] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231227_013451__664.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4257 | Apple-MacBook-Pro-M1 | keep_only_names | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_122211__324 | 0 | 0.0 | 18.138 | 0 | [102, 681] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_122211__324.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4258 | Apple-MacBook-Pro-M1 | keep_only_names | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_122227__818 | 0 | 0.0 | 16.159 | 0 | [102, 611] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_122227__818.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4259 | Apple-MacBook-Pro-M1 | keep_only_names | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_013449__993 | 0 | 0.0 | 1.39127 | 0 | [102, 43] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_013449__993.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4260 | Apple-MacBook-Pro-M1 | keep_only_names | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_122151__635 | 0 | 0.0 | 4.41918 | 0 | [190, 8] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_122151__635.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4261 | Apple-MacBook-Pro-M1 | keep_only_names | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_122153__481 | 0 | 0.0 | 1.6672 | 0 | [190, 45] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_122153__481.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4262 | Apple-MacBook-Pro-M1 | keep_only_names | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_013448__498 | 0 | 0.0 | 4.41691 | 0 | [190, 8] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_013448__498.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4263 | Apple-MacBook-Pro-M1 | keep_only_names | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_122329__619 | 0 | 0.0 | 1.21424 | 0 | [354, 1] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_122329__619.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4264 | Apple-MacBook-Pro-M1 | keep_only_names | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_122330__643 | 0 | 0.0 | 1.22892 | 0 | [354, 1] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_122330__643.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4265 | Apple-MacBook-Pro-M1 | keep_only_names | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_013458__702 | 0 | 0.0 | 1.21667 | 0 | [354, 1] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_013458__702.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4266 | Apple-MacBook-Pro-M1 | keep_only_names | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_122325__667 | 0 | 0.0 | 5.62783 | 0 | [351, 173] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_122325__667.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4267 | Apple-MacBook-Pro-M1 | keep_only_names | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_122328__692 | 0 | 0.0 | 2.45268 | 0 | [351, 52] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_122328__692.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4268 | Apple-MacBook-Pro-M1 | keep_only_names | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_013457__106 | 0 | 0.0 | 5.21891 | 0 | [351, 157] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_013457__106.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4269 | Apple-MacBook-Pro-M1 | keep_only_names | phind-codellama:34b-v2 | InJulia | 1SHOT | true | false | 5 | 20231214_080352__896 | 0 | 0.0 | 10.5665 | 0 | [69, 316] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231214_080352__896.json | 25.0 | missing | missing | missing | |
| 4270 | Apple-MacBook-Pro-M1 | keep_only_names | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_120347__377 | 3 | 0.0 | 34.5567 | 4 | [80, 267] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_120347__377.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4271 | Apple-MacBook-Pro-M1 | keep_only_names | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_120429__339 | 3 | 0.0 | 41.7046 | 4 | [80, 325] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_120429__339.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4272 | Apple-MacBook-Pro-M1 | keep_only_names | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231227_012725__144 | 5 | 0.0 | 29.3991 | 4 | [80, 223] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231227_012725__144.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4273 | Apple-MacBook-Pro-M1 | keep_only_names | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_080342__887 | 1 | 0.0 | 9.60226 | 0 | [98, 276] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231214_080342__887.json | 55.0 | missing | missing | missing | |
| 4274 | Apple-MacBook-Pro-M1 | keep_only_names | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_120305__803 | 3 | 0.0 | 18.9746 | 4 | [119, 135] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_120305__803.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4275 | Apple-MacBook-Pro-M1 | keep_only_names | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_120313__214 | 3 | 0.0 | 7.3003 | 4 | [119, 39] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_120313__214.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4276 | Apple-MacBook-Pro-M1 | keep_only_names | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_012656__661 | 0 | 0.0 | 43.8257 | 4 | [119, 333] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231227_012656__661.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4277 | Apple-MacBook-Pro-M1 | keep_only_names | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_080332__517 | 2 | 0.0 | 16.8668 | 4 | [188, 455] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_080332__517.json | 85.0 | missing | missing | missing | |
| 4278 | Apple-MacBook-Pro-M1 | keep_only_names | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_120211__663 | 3 | 0.0 | 57.8374 | 4 | [209, 256] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_120211__663.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4279 | Apple-MacBook-Pro-M1 | keep_only_names | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_120246__100 | 3 | 0.0 | 34.5705 | 4 | [209, 244] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_120246__100.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4280 | Apple-MacBook-Pro-M1 | keep_only_names | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_012612__649 | 0 | 0.0 | 51.2141 | 0 | [209, 210] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_012612__649.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4281 | Apple-MacBook-Pro-M1 | keep_only_names | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_080421__824 | 0 | 0.0 | 1.33774 | 0 | [11, 33] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_080421__824.json | 0.0 | missing | missing | missing | |
| 4282 | Apple-MacBook-Pro-M1 | keep_only_names | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_120701__684 | 0 | 0.0 | 22.2076 | 0 | [383, 115] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_120701__684.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4283 | Apple-MacBook-Pro-M1 | keep_only_names | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_120751__362 | 3 | 0.0 | 49.8922 | 4 | [383, 333] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_120751__362.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4284 | Apple-MacBook-Pro-M1 | keep_only_names | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_012833__525 | 0 | 0.0 | 37.169 | 0 | [383, 232] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_012833__525.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4285 | Apple-MacBook-Pro-M1 | keep_only_names | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_080420__581 | 1 | 0.0 | 16.8592 | 0 | [369, 380] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231214_080420__581.json | 55.0 | missing | missing | missing | |
| 4286 | Apple-MacBook-Pro-M1 | keep_only_names | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_120551__192 | 3 | 0.0 | 46.0477 | 4 | [380, 303] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_120551__192.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4287 | Apple-MacBook-Pro-M1 | keep_only_names | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_120638__495 | 3 | 0.0 | 47.056 | 4 | [380, 311] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_120638__495.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4288 | Apple-MacBook-Pro-M1 | keep_only_names | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_012756__237 | 3 | 0.0 | 30.4719 | 4 | [380, 180] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231227_012756__237.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4289 | Apple-MacBook-Pro-M1 | keep_only_names | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_121615__294 | 1 | 0.0 | 15.8222 | 0 | [75, 267] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_121615__294.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4290 | Apple-MacBook-Pro-M1 | keep_only_names | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_121627__962 | 5 | 0.0 | 11.8505 | 4 | [75, 197] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_121627__962.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4291 | Apple-MacBook-Pro-M1 | keep_only_names | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231227_013215__194 | 0 | 0.0 | 13.6522 | 0 | [75, 228] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231227_013215__194.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4292 | Apple-MacBook-Pro-M1 | keep_only_names | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_121545__792 | 3 | 0.0 | 13.8565 | 4 | [116, 227] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_121545__792.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4293 | Apple-MacBook-Pro-M1 | keep_only_names | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_121559__188 | 1 | 0.0 | 14.0693 | 0 | [116, 231] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_121559__188.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4294 | Apple-MacBook-Pro-M1 | keep_only_names | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_013201__514 | 0 | 0.0 | 13.0467 | 0 | [116, 212] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_013201__514.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4295 | Apple-MacBook-Pro-M1 | keep_only_names | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_121515__627 | 1 | 0.0 | 27.8238 | 0 | [206, 294] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_121515__627.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4296 | Apple-MacBook-Pro-M1 | keep_only_names | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_121531__576 | 0 | 0.0 | 16.0436 | 0 | [206, 250] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_121531__576.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4297 | Apple-MacBook-Pro-M1 | keep_only_names | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_013148__757 | 0 | 0.0 | 23.1094 | 0 | [206, 226] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_013148__757.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4298 | Apple-MacBook-Pro-M1 | keep_only_names | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_121749__968 | 1 | 0.0 | 22.678 | 0 | [383, 336] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_121749__968.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4299 | Apple-MacBook-Pro-M1 | keep_only_names | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_121811__747 | 1 | 0.0 | 21.6739 | 0 | [383, 319] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_121811__747.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4300 | Apple-MacBook-Pro-M1 | keep_only_names | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_013255__151 | 5 | 0.0 | 15.1888 | 4 | [383, 209] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_013255__151.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4301 | Apple-MacBook-Pro-M1 | keep_only_names | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_121705__838 | 5 | 0.0 | 15.5506 | 4 | [381, 216] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_121705__838.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4302 | Apple-MacBook-Pro-M1 | keep_only_names | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_121727__224 | 0 | 0.0 | 21.0642 | 0 | [381, 309] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_121727__224.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4303 | Apple-MacBook-Pro-M1 | keep_only_names | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_013240__198 | 1 | 0.0 | 24.8247 | 0 | [381, 370] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_013240__198.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4304 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | InJulia | 1SHOT | true | false | 5 | 20231214_080055__851 | 0 | 0.0 | 10.6618 | 0 | [69, 318] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__InJulia__1SHOT__20231214_080055__851.json | 25.0 | missing | missing | missing | |
| 4305 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | InJulia | 1SHOT | true | false | 5 | 20231225_115719__293 | 0 | 0.0 | 4.97764 | 0 | [75, 284] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_115719__293.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4306 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | InJulia | 1SHOT | true | true | 5 | 20231225_115725__625 | 0 | 0.0 | 5.11245 | 0 | [75, 291] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_115725__625.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4307 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | InJulia | 1SHOT | true | false | 5 | 20231227_012339__171 | 0 | 0.0 | 5.48043 | 0 | [75, 311] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__InJulia__1SHOT__20231227_012339__171.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4308 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_080044__282 | 5 | 0.0 | 6.17026 | 4 | [98, 172] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231214_080044__282.json | 100.0 | missing | missing | missing | |
| 4309 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_115708__269 | 0 | 0.0 | 1.37596 | 0 | [112, 67] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_115708__269.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4310 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_115714__101 | 0 | 0.0 | 6.07426 | 0 | [112, 340] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_115714__101.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4311 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_012334__872 | 0 | 0.0 | 4.82818 | 0 | [112, 267] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231227_012334__872.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4312 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_080038__288 | 1 | 0.0 | 18.1234 | 0 | [188, 489] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231214_080038__288.json | 55.0 | missing | missing | missing | |
| 4313 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_115704__317 | 0 | 0.0 | 6.11064 | 0 | [197, 173] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_115704__317.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4314 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_115707__681 | 0 | 0.0 | 3.40053 | 0 | [197, 168] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_115707__681.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4315 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_012329__167 | 0 | 0.0 | 11.1561 | 0 | [197, 439] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231227_012329__167.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4316 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_080149__303 | 0 | 0.0 | 22.2493 | 0 | [11, 599] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231214_080149__303.json | 25.0 | missing | missing | missing | |
| 4317 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_115746__487 | 0 | 0.0 | 5.7509 | 0 | [362, 254] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_115746__487.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4318 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_115751__802 | 0 | 0.0 | 5.30094 | 0 | [362, 233] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_115751__802.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4319 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_012351__545 | 0 | 0.0 | 5.77068 | 0 | [362, 257] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231227_012351__545.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4320 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_080127__188 | 0 | 0.0 | 23.62 | 0 | [369, 555] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231214_080127__188.json | 50.0 | missing | missing | missing | |
| 4321 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_115735__923 | 0 | 0.0 | 4.59024 | 0 | [360, 188] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_115735__923.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4322 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_115740__284 | 0 | 0.0 | 4.94026 | 0 | [360, 213] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_115740__284.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4323 | Apple-MacBook-Pro-M1 | keep_only_names | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_012345__392 | 0 | 0.0 | 5.32464 | 0 | [360, 234] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231227_012345__392.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4324 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | InJulia | 1SHOT | true | false | 5 | 20231214_075436__541 | 0 | 0.0 | 9.90712 | 0 | [69, 295] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__InJulia__1SHOT__20231214_075436__541.json | 25.0 | missing | missing | missing | |
| 4325 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_114412__643 | 2 | 0.0 | 8.55479 | 4 | [75, 274] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_114412__643.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4326 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_114417__312 | 1 | 0.0 | 4.7001 | 0 | [75, 144] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_114417__312.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4327 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231227_011727__392 | 1 | 0.0 | 6.60204 | 0 | [75, 208] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__InJulia__1SHOT__20231227_011727__392.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4328 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_075426__293 | 2 | 0.0 | 7.26552 | 4 | [98, 205] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231214_075426__293.json | 85.0 | missing | missing | missing | |
| 4329 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_114355__290 | 1 | 0.0 | 8.65511 | 0 | [116, 272] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_114355__290.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4330 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_114403__886 | 2 | 0.0 | 7.69726 | 4 | [116, 239] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_114403__886.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4331 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_011720__313 | 1 | 0.0 | 6.00522 | 0 | [116, 182] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231227_011720__313.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4332 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_075419__423 | 1 | 0.0 | 10.0351 | 0 | [188, 263] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231214_075419__423.json | 55.0 | missing | missing | missing | |
| 4333 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_114332__505 | 5 | 0.0 | 20.7529 | 4 | [206, 472] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_114332__505.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4334 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_114346__137 | 5 | 0.0 | 14.3472 | 4 | [206, 439] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_114346__137.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4335 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_011714__123 | 5 | 0.0 | 18.1528 | 4 | [206, 394] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231227_011714__123.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4336 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_075516__789 | 0 | 0.0 | 10.1656 | 0 | [11, 283] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231214_075516__789.json | 0.0 | missing | missing | missing | |
| 4337 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_114456__603 | 2 | 0.0 | 9.28117 | 4 | [383, 246] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_114456__603.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4338 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_114506__915 | 1 | 0.0 | 9.7461 | 0 | [383, 261] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_114506__915.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4339 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_011755__640 | 1 | 0.0 | 9.16901 | 0 | [383, 241] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231227_011755__640.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4340 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_075506__423 | 1 | 0.0 | 19.4093 | 0 | [369, 447] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231214_075506__423.json | 55.0 | missing | missing | missing | |
| 4341 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_114439__588 | 1 | 0.0 | 10.4916 | 0 | [381, 284] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_114439__588.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4342 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_114447__151 | 5 | 0.0 | 7.1925 | 4 | [381, 179] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_114447__151.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4343 | Apple-MacBook-Pro-M1 | keep_only_names | starling-lm:latest | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_011746__925 | 0 | 0.0 | 18.9519 | 0 | [381, 544] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231227_011746__925.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4344 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231214_075539__868 | 0 | 0.0 | 9.75106 | 0 | [69, 291] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__InJulia__1SHOT__20231214_075539__868.json | 25.0 | missing | missing | missing | |
| 4345 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231225_114804__476 | 1 | 0.0 | 70.6903 | 0 | [71, 536] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_114804__476.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4346 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231225_114848__700 | 0 | 0.0 | 43.5071 | 0 | [71, 328] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_114848__700.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4347 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231227_011947__677 | 1 | 0.0 | 58.5035 | 0 | [71, 441] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__InJulia__1SHOT__20231227_011947__677.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4348 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_075529__292 | 0 | 0.0 | 7.70736 | 0 | [98, 220] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231214_075529__292.json | 25.0 | missing | missing | missing | |
| 4349 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_114629__365 | 5 | 0.0 | 7.10941 | 4 | [110, 36] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_114629__365.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4350 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_114654__221 | 1 | 0.0 | 23.8852 | 0 | [110, 169] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_114654__221.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4351 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_011848__599 | 5 | 0.0 | 14.3099 | 4 | [110, 93] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231227_011848__599.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4352 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_075522__503 | 0 | 0.0 | 5.57692 | 0 | [188, 130] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231214_075522__503.json | 0.0 | missing | missing | missing | |
| 4353 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_114552__743 | 1 | 0.0 | 45.6022 | 4 | [200, 139] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_114552__743.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4354 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_114622__102 | 5 | 0.0 | 30.5297 | 4 | [200, 204] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_114622__102.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4355 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_011834__857 | 0 | 0.0 | 38.6292 | 0 | [200, 92] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231227_011834__857.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4356 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_075622__350 | 0 | 0.0 | 16.2484 | 0 | [11, 445] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231214_075622__350.json | 50.0 | missing | missing | missing | |
| 4357 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_115110__626 | 1 | 0.0 | 15.4574 | 0 | [384, 59] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_115110__626.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4358 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_115209__114 | 5 | 0.0 | 59.1356 | 4 | [384, 388] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_115209__114.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4359 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_012139__677 | 0 | 0.0 | 63.0962 | 0 | [384, 416] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231227_012139__677.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4360 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_075606__266 | 1 | 0.0 | 19.3083 | 0 | [369, 444] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231214_075606__266.json | 55.0 | missing | missing | missing | |
| 4361 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_115020__152 | 5 | 0.0 | 36.9208 | 4 | [382, 222] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_115020__152.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4362 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_115055__164 | 0 | 0.0 | 34.2584 | 0 | [382, 202] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_115055__164.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4363 | Apple-MacBook-Pro-M1 | keep_only_names | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_012035__154 | 0 | 0.0 | 48.2187 | 0 | [382, 306] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/keep_only_names/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231227_012035__154.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4364 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | InJulia | 1SHOT | true | false | 5 | 20231214_081254__785 | 0 | 0.0 | 16.7285 | 0 | [70, 493] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231214_081254__785.json | 25.0 | missing | missing | missing | |
| 4365 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | InJulia | 1SHOT | false | false | 5 | 20231225_125019__725 | 0 | 0.0 | 25.712 | 0 | [78, 473] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_125019__725.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4366 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_125047__539 | 1 | 0.0 | 28.2169 | 2 | [78, 518] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_125047__539.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 4367 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231227_015034__563 | 0 | 0.0 | 24.8472 | 0 | [78, 450] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231227_015034__563.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4368 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_081237__359 | 0 | 0.0 | 18.9857 | 0 | [99, 544] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231214_081237__359.json | 50.0 | missing | missing | missing | |
| 4369 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_124935__414 | 0 | 0.0 | 18.7463 | 0 | [116, 338] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_124935__414.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4370 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_124953__383 | 0 | 0.0 | 17.3474 | 0 | [116, 312] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_124953__383.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4371 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_014809__338 | 0 | 0.0 | 12.9204 | 1 | [116, 227] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231227_014809__338.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4372 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_081218__584 | 0 | 0.0 | 20.6979 | 0 | [187, 558] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231214_081218__584.json | 25.0 | missing | missing | missing | |
| 4373 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_124855__288 | 0 | 0.0 | 42.0522 | 0 | [205, 564] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_124855__288.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4374 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_124917__503 | 0 | 0.0 | 22.041 | 0 | [205, 380] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_124917__503.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4375 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_014756__213 | 0 | 0.0 | 42.1065 | 0 | [205, 568] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231227_014756__213.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4376 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_081406__119 | 0 | 0.0 | 21.1032 | 0 | [11, 570] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231214_081406__119.json | 50.0 | missing | missing | missing | |
| 4377 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_125232__993 | 0 | 0.0 | 20.8633 | 3 | [381, 326] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_125232__993.json | 68.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 4378 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_125307__928 | 0 | 0.0 | 33.9336 | 0 | [381, 546] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_125307__928.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4379 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_015125__150 | 0 | 0.0 | 23.9987 | 0 | [381, 379] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231227_015125__150.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4380 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_081344__329 | 0 | 0.0 | 29.2769 | 0 | [370, 694] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231214_081344__329.json | 25.0 | missing | missing | missing | |
| 4381 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_125156__595 | 0 | 0.0 | 30.4516 | 0 | [378, 494] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_125156__595.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4382 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_125211__963 | 0 | 0.0 | 14.819 | 0 | [378, 217] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_125211__963.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4383 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_015101__839 | 0 | 0.0 | 26.2985 | 0 | [378, 419] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231227_015101__839.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4384 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | InJulia | 1SHOT | true | false | 5 | 20231214_081457__126 | 0 | 0.0 | 13.8492 | 0 | [70, 411] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__InJulia__1SHOT__20231214_081457__126.json | 25.0 | missing | missing | missing | |
| 4385 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_125408__196 | 0 | 0.0 | 20.3985 | 0 | [52, 377] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_125408__196.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4386 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_125448__622 | 0 | 0.0 | 39.8091 | 0 | [52, 720] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_125448__622.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4387 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_081443__680 | 0 | 0.0 | 18.3441 | 0 | [99, 526] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231214_081443__680.json | 0.0 | missing | missing | missing | |
| 4388 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_125323__492 | 0 | 0.0 | 1.60976 | 0 | [53, 20] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_125323__492.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4389 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_125348__338 | 0 | 0.0 | 24.6256 | 0 | [53, 450] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_125348__338.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4390 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_081424__916 | 0 | 0.0 | 18.6604 | 0 | [187, 504] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231214_081424__916.json | 0.0 | missing | missing | missing | |
| 4391 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_125319__140 | 0 | 0.0 | 12.3596 | 0 | [80, 28] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_125319__140.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4392 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_125321__916 | 0 | 0.0 | 2.00512 | 0 | [80, 23] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_125321__916.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4393 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_081606__122 | 0 | 0.0 | 23.9125 | 0 | [11, 640] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231214_081606__122.json | 50.0 | missing | missing | missing | |
| 4394 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_125527__297 | 0 | 0.0 | 1.37585 | 0 | [70, 11] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_125527__297.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4395 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_125530__311 | 0 | 0.0 | 3.09635 | 0 | [70, 43] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_125530__311.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4396 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_081542__814 | 0 | 0.0 | 22.9211 | 0 | [370, 537] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231214_081542__814.json | 25.0 | missing | missing | missing | |
| 4397 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_125522__438 | 0 | 0.0 | 3.60002 | 0 | [67, 54] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_125522__438.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4398 | Apple-MacBook-Pro-M1 | pig_latinify | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_125525__633 | 0 | 0.0 | 3.64621 | 0 | [67, 55] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_125525__633.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4399 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_133915__199 | 0 | 0.0 | 50.9174 | 0 | [67, 308] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_133915__199.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4400 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231225_134006__603 | 0 | 0.0 | 50.6024 | 0 | [67, 306] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_134006__603.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4401 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_020752__373 | 0 | 0.0 | 54.759 | 0 | [67, 331] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_020752__373.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4402 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_133700__108 | 0 | 0.0 | 70.3717 | 0 | [108, 422] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_133700__108.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4403 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_133823__159 | 4 | 0.0 | 82.8659 | 3 | [108, 498] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_133823__159.json | 88.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 4404 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_020656__422 | 3 | 0.0 | 47.845 | 2 | [108, 283] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_020656__422.json | 77.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 4405 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_133508__568 | 2 | 0.0 | 83.0887 | 2 | [197, 322] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_133508__568.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 4406 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_133550__551 | 0 | 0.0 | 41.6344 | 0 | [197, 228] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_133550__551.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4407 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_020609__962 | 0 | 0.0 | 79.869 | 0 | [197, 317] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_020609__962.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4408 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_134740__983 | 0 | 0.0 | 71.6321 | 0 | [396, 376] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_134740__983.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4409 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_134848__152 | 2 | 0.0 | 68.5964 | 2 | [396, 358] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_134848__152.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 4410 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_021005__235 | 0 | 0.0 | 44.5888 | 0 | [396, 213] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_021005__235.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4411 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_134515__641 | 0 | 0.0 | 88.1433 | 0 | [394, 474] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_134515__641.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4412 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_134628__353 | 0 | 0.0 | 71.6973 | 0 | [394, 376] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_134628__353.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4413 | Apple-MacBook-Pro-M1 | pig_latinify | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_020920__515 | 3 | 0.0 | 88.1929 | 2 | [394, 474] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_020920__515.json | 77.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 4414 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_022003__367 | 0 | 0.0 | 11.8422 | 0 | [69, 454] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_022003__367.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4415 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_123502__623 | 1 | 0.0 | 10.9055 | 1 | [69, 420] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_123502__623.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4416 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_123513__823 | 0 | 0.0 | 11.7069 | 0 | [69, 450] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_123513__823.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4417 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_123534__678 | 0 | 0.0 | 20.1983 | 0 | [69, 757] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_123534__678.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4418 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_021951__337 | 0 | 0.0 | 13.6045 | 0 | [106, 513] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_021951__337.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4419 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_123422__713 | 0 | 0.0 | 11.5347 | 0 | [106, 437] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_123422__713.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4420 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_123437__851 | 0 | 0.0 | 15.0882 | 0 | [106, 568] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_123437__851.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4421 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_123450__681 | 0 | 0.0 | 13.6626 | 0 | [106, 516] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_123450__681.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4422 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_021938__671 | 0 | 0.0 | 11.2671 | 0 | [193, 284] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_021938__671.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4423 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_123340__512 | 0 | 0.0 | 13.131 | 0 | [193, 347] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_123340__512.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4424 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_123357__794 | 0 | 0.0 | 17.1689 | 0 | [193, 621] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_123357__794.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4425 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_123410__248 | 0 | 0.0 | 13.2974 | 0 | [193, 483] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_123410__248.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4426 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_022028__969 | 0 | 0.0 | 11.8747 | 0 | [358, 396] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_022028__969.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4427 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_123652__250 | 0 | 0.0 | 26.2219 | 0 | [358, 882] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_123652__250.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4428 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_123704__657 | 0 | 0.0 | 11.9096 | 0 | [358, 398] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_123704__657.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4429 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_123719__114 | 0 | 0.0 | 14.6431 | 0 | [358, 494] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_123719__114.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4430 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_022016__963 | 0 | 0.0 | 12.5721 | 0 | [355, 421] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_022016__963.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4431 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_123555__345 | 0 | 0.0 | 21.7296 | 0 | [355, 736] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_123555__345.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4432 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_123610__556 | 0 | 0.0 | 14.6353 | 0 | [355, 494] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_123610__556.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4433 | Apple-MacBook-Pro-M1 | pig_latinify | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_123626__310 | 0 | 0.0 | 15.7251 | 0 | [355, 532] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_123626__310.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4434 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | InJulia | 1SHOT | true | false | 5 | 20231214_080517__995 | 0 | 0.0 | 19.034 | 0 | [70, 557] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__InJulia__1SHOT__20231214_080517__995.json | 25.0 | missing | missing | missing | |
| 4435 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | InJulia | 1SHOT | false | false | 5 | 20231225_122443__739 | 0 | 0.0 | 16.4281 | 0 | [70, 487] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__InJulia__1SHOT__20231225_122443__739.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4436 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | InJulia | 1SHOT | true | false | 5 | 20231225_122506__948 | 0 | 0.0 | 22.1989 | 0 | [1, 662] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__InJulia__1SHOT__20231225_122506__948.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4437 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | InJulia | 1SHOT | true | false | 5 | 20231227_013748__213 | 0 | 0.0 | 14.8003 | 0 | [70, 446] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__InJulia__1SHOT__20231227_013748__213.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4438 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_080458__473 | 0 | 0.0 | 16.9059 | 0 | [99, 487] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaExpertAsk__1SHOT__20231214_080458__473.json | 50.0 | missing | missing | missing | |
| 4439 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_122416__983 | 0 | 0.0 | 12.0738 | 0 | [99, 351] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_122416__983.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4440 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_122427__630 | 0 | 0.0 | 10.9186 | 0 | [1, 338] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_122427__630.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4441 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_013733__727 | 0 | 0.0 | 8.68223 | 0 | [99, 253] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaExpertAsk__1SHOT__20231227_013733__727.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4442 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_080441__516 | 1 | 0.0 | 19.7883 | 1 | [187, 534] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_080441__516.json | 61.25 | missing | missing | missing | |
| 4443 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_122351__855 | 0 | 0.0 | 20.2376 | 0 | [205, 407] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_122351__855.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4444 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_122404__302 | 0 | 0.0 | 13.2071 | 0 | [1, 394] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_122404__302.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4445 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_013724__814 | 0 | 0.0 | 25.7546 | 0 | [205, 569] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_013724__814.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4446 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_080609__775 | 0 | 0.0 | 14.5555 | 0 | [11, 401] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_080609__775.json | 25.0 | missing | missing | missing | |
| 4447 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_122656__550 | 0 | 0.0 | 22.8302 | 0 | [11, 615] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_122656__550.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4448 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_122729__119 | 0 | 0.0 | 33.0791 | 0 | [1, 866] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_122729__119.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4449 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_013840__225 | 0 | 0.0 | 23.4498 | 0 | [11, 637] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_013840__225.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4450 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_080555__922 | 0 | 0.0 | 19.5064 | 0 | [370, 450] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaRecapTask__1SHOT__20231214_080555__922.json | 0.0 | missing | missing | missing | |
| 4451 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_122609__122 | 0 | 0.0 | 23.9108 | 0 | [370, 565] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_122609__122.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4452 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_122633__408 | 0 | 0.0 | 23.4272 | 0 | [1, 636] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_122633__408.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4453 | Apple-MacBook-Pro-M1 | pig_latinify | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_013816__879 | 0 | 0.0 | 28.2876 | 0 | [370, 680] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/llama2/evaluation__JuliaRecapTask__1SHOT__20231227_013816__879.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4454 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | InJulia | 1SHOT | true | false | 5 | 20231214_081652__286 | 0 | 0.0 | 13.5762 | 0 | [70, 404] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__InJulia__1SHOT__20231214_081652__286.json | 25.0 | missing | missing | missing | |
| 4455 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_125643__922 | 2 | 0.0 | 11.0357 | 2 | [70, 366] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__InJulia__1SHOT__20231225_125643__922.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 4456 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | InJulia | 1SHOT | true | false | 5 | 20231227_015705__648 | 0 | 0.0 | 10.3978 | 0 | [70, 341] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__InJulia__1SHOT__20231227_015705__648.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4457 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_081638__251 | 0 | 0.0 | 14.5211 | 1 | [99, 420] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231214_081638__251.json | 56.25 | missing | missing | missing | |
| 4458 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_125621__660 | 0 | 0.0 | 11.0939 | 0 | [109, 353] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_125621__660.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4459 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_125632__409 | 0 | 0.0 | 10.2177 | 0 | [109, 329] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_125632__409.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4460 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_015654__585 | 0 | 0.0 | 12.6526 | 0 | [109, 408] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231227_015654__585.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4461 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_081623__485 | 0 | 0.0 | 17.3955 | 0 | [187, 468] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231214_081623__485.json | 0.0 | missing | missing | missing | |
| 4462 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_125553__638 | 0 | 0.0 | 23.4637 | 0 | [197, 540] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_125553__638.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4463 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_125610__355 | 0 | 0.0 | 17.1099 | 0 | [197, 524] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_125610__355.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4464 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_015641__304 | 1 | 0.0 | 16.3082 | 2 | [197, 330] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231227_015641__304.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 4465 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_081805__812 | 0 | 0.0 | 31.5993 | 0 | [11, 824] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231214_081805__812.json | 25.0 | missing | missing | missing | |
| 4466 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_131537__638 | 1 | 0.0 | 15.1512 | 1 | [373, 431] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_131537__638.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4467 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_131553__805 | 0 | 0.0 | 15.8887 | 0 | [373, 459] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_131553__805.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4468 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_015727__982 | 0 | 0.0 | 11.8863 | 0 | [373, 333] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231227_015727__982.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4469 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_081734__962 | 0 | 0.0 | 16.6255 | 0 | [370, 374] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaRecapTask__1SHOT__20231214_081734__962.json | 0.0 | missing | missing | missing | |
| 4470 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_131513__546 | 0 | 0.0 | 11.9524 | 2 | [370, 339] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_131513__546.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 4471 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_131522__120 | 0 | 0.0 | 8.58172 | 0 | [370, 219] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_131522__120.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4472 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_015715__177 | 0 | 0.0 | 10.0699 | 1 | [370, 276] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder/evaluation__JuliaRecapTask__1SHOT__20231227_015715__177.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4473 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_182130__277 | 0 | 0.0 | 23.7372 | 0 | [70, 447] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_182130__277.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4474 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_182154__361 | 0 | 0.0 | 23.9588 | 0 | [70, 460] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_182154__361.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4475 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_182210__858 | 0 | 0.0 | 16.2913 | 0 | [70, 311] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_182210__858.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4476 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_182021__262 | 0 | 0.0 | 21.6223 | 0 | [109, 412] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_182021__262.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4477 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_182046__955 | 0 | 0.0 | 24.9038 | 1 | [109, 468] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_182046__955.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4478 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_182106__800 | 1 | 0.0 | 20.1893 | 1 | [109, 383] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_182106__800.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4479 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_181931__170 | 0 | 0.0 | 17.6014 | 0 | [197, 305] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_181931__170.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4480 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_181944__445 | 0 | 0.0 | 12.6711 | 0 | [197, 223] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_181944__445.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4481 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_181959__101 | 0 | 0.0 | 15.0933 | 0 | [197, 271] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_181959__101.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4482 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_182337__434 | 0 | 0.0 | 14.3272 | 0 | [373, 239] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_182337__434.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4483 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_182401__335 | 0 | 0.0 | 24.2041 | 1 | [373, 410] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_182401__335.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4484 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_182421__468 | 0 | 0.0 | 20.0301 | 0 | [373, 348] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_182421__468.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4485 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_182237__431 | 0 | 0.0 | 26.8419 | 1 | [370, 472] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_182237__431.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4486 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_182305__300 | 0 | 0.0 | 27.5392 | 1 | [370, 484] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_182305__300.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4487 | Apple-MacBook-Pro-M1 | pig_latinify | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_182322__761 | 0 | 0.0 | 16.7266 | 1 | [370, 281] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_182322__761.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4488 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231225_135534__374 | 0 | 0.0 | 18.1549 | 0 | [65, 460] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_135534__374.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4489 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_135551__316 | 0 | 0.0 | 16.874 | 1 | [65, 427] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_135551__316.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4490 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_021310__573 | 0 | 0.0 | 15.078 | 0 | [65, 379] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_021310__573.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4491 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_135509__911 | 1 | 0.0 | 14.718 | 1 | [106, 366] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_135509__911.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4492 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_135516__616 | 0 | 0.0 | 6.83943 | 0 | [106, 162] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_135516__616.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4493 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_021255__329 | 1 | 0.0 | 13.2668 | 1 | [106, 327] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_021255__329.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4494 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_135439__515 | 1 | 0.0 | 25.7898 | 1 | [194, 488] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_135439__515.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4495 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_135454__679 | 0 | 0.0 | 14.9856 | 0 | [194, 357] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_135454__679.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4496 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_021242__989 | 1 | 0.0 | 19.6122 | 1 | [194, 337] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_021242__989.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4497 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_135825__870 | 0 | 0.0 | 83.5342 | 0 | [373, 1876] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_135825__870.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4498 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_135849__294 | 0 | 0.0 | 24.1658 | 0 | [373, 555] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_135849__294.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4499 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_021349__190 | 0 | 0.0 | 17.856 | 0 | [373, 398] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_021349__190.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4500 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_135641__286 | 0 | 0.0 | 19.4596 | 0 | [371, 440] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_135641__286.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4501 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_135701__646 | 0 | 0.0 | 19.9801 | 0 | [371, 453] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_135701__646.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4502 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_021331__324 | 0 | 0.0 | 20.7726 | 0 | [371, 469] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_021331__324.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4503 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231228_001149__803 | 0 | 0.0 | 19.7486 | 0 | [64, 628] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_001149__803.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4504 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231228_001205__447 | 0 | 0.0 | 15.501 | 0 | [64, 496] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_001205__447.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4505 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | false | 5 | 20231228_001219__967 | 0 | 0.0 | 14.2462 | 0 | [64, 456] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_001219__967.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4506 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231228_001238__607 | 0 | 0.0 | 19.2741 | 0 | [64, 614] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_001238__607.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4507 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | false | 5 | 20231228_001255__370 | 0 | 0.0 | 16.5398 | 0 | [64, 529] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_001255__370.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4508 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_001031__376 | 0 | 0.0 | 15.696 | 0 | [105, 490] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_001031__376.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4509 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231228_001044__582 | 0 | 0.0 | 12.6011 | 0 | [105, 393] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_001044__582.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4510 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231228_001056__594 | 0 | 0.0 | 12.0663 | 0 | [105, 376] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_001056__594.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4511 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231228_001111__813 | 0 | 0.0 | 15.1373 | 0 | [105, 473] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_001111__813.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4512 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_001129__319 | 0 | 0.0 | 17.9975 | 0 | [105, 562] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_001129__319.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4513 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231228_000902__340 | 0 | 0.0 | 20.961 | 0 | [193, 615] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_000902__340.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4514 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231228_000924__705 | 0 | 0.0 | 22.1956 | 0 | [193, 672] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_000924__705.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4515 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231228_000946__318 | 0 | 0.0 | 21.7389 | 0 | [193, 657] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_000946__318.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4516 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231228_001003__899 | 0 | 0.0 | 16.5697 | 0 | [193, 499] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_001003__899.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4517 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231228_001016__154 | 0 | 0.0 | 12.7873 | 0 | [193, 381] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_001016__154.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4518 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231228_001448__237 | 0 | 0.0 | 14.3268 | 0 | [372, 396] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_001448__237.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4519 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_001511__996 | 0 | 0.0 | 22.9715 | 0 | [372, 656] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_001511__996.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4520 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231228_001537__804 | 0 | 0.0 | 25.8778 | 0 | [372, 740] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_001537__804.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4521 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231228_001558__800 | 0 | 0.0 | 20.9281 | 0 | [372, 596] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_001558__800.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4522 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231228_001614__886 | 0 | 0.0 | 16.3526 | 0 | [372, 458] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_001614__886.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4523 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231228_001314__598 | 0 | 0.0 | 19.498 | 0 | [370, 553] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_001314__598.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4524 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_001333__499 | 0 | 0.0 | 19.1243 | 0 | [370, 542] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_001333__499.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4525 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231228_001355__548 | 0 | 0.0 | 21.9058 | 0 | [370, 624] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_001355__548.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4526 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231228_001415__479 | 0 | 0.0 | 19.2952 | 0 | [370, 547] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_001415__479.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4527 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_001434__518 | 0 | 0.0 | 18.7747 | 0 | [370, 531] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_001434__518.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4528 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_001946__324 | 1 | 0.0 | 22.0675 | 1 | [64, 557] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_001946__324.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4529 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_002010__143 | 0 | 0.0 | 23.3697 | 0 | [64, 589] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_002010__143.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4530 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_002030__930 | 0 | 0.0 | 20.2347 | 0 | [64, 511] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_002030__930.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4531 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_002053__497 | 0 | 0.0 | 22.7949 | 0 | [64, 575] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_002053__497.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4532 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231228_002109__527 | 0 | 0.0 | 15.6146 | 0 | [64, 395] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_002109__527.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4533 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231228_001817__774 | 0 | 0.0 | 13.2522 | 0 | [105, 325] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_001817__774.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4534 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_001840__602 | 0 | 0.0 | 22.6248 | 0 | [105, 560] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_001840__602.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4535 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_001855__815 | 0 | 0.0 | 15.3798 | 0 | [105, 379] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_001855__815.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4536 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_001909__707 | 0 | 0.0 | 13.6625 | 0 | [105, 336] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_001909__707.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4537 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231228_001924__730 | 0 | 0.0 | 14.9344 | 0 | [105, 368] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_001924__730.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4538 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231228_001634__251 | 0 | 0.0 | 19.5012 | 0 | [193, 446] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_001634__251.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4539 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_001653__153 | 0 | 0.0 | 19.1643 | 0 | [193, 458] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_001653__153.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4540 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231228_001718__757 | 0 | 0.0 | 24.6262 | 0 | [193, 592] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_001718__757.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4541 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231228_001738__492 | 0 | 0.0 | 20.2683 | 0 | [193, 485] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_001738__492.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4542 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_001804__973 | 0 | 0.0 | 25.8256 | 0 | [193, 621] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_001804__973.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4543 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_002329__776 | 0 | 0.0 | 21.6645 | 0 | [372, 489] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_002329__776.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4544 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231228_002357__922 | 0 | 0.0 | 27.1932 | 0 | [372, 621] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_002357__922.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4545 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_002425__734 | 0 | 0.0 | 28.7898 | 0 | [372, 659] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_002425__734.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4546 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231228_002449__194 | 0 | 0.0 | 23.1292 | 0 | [372, 524] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_002449__194.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4547 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231228_002512__718 | 0 | 0.0 | 22.944 | 0 | [372, 520] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_002512__718.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4548 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231228_002140__994 | 0 | 0.0 | 30.617 | 0 | [370, 702] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_002140__994.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4549 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_002157__991 | 1 | 0.0 | 16.9406 | 1 | [370, 374] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_002157__991.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4550 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_002213__100 | 1 | 0.0 | 15.5256 | 1 | [370, 339] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_002213__100.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4551 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_002245__625 | 0 | 0.0 | 32.0772 | 0 | [370, 736] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_002245__625.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4552 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231228_002307__438 | 0 | 0.0 | 21.9601 | 0 | [370, 496] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_002307__438.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4553 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_123157__297 | 0 | 0.0 | 26.9498 | 0 | [64, 501] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_123157__297.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4554 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_123230__393 | 0 | 0.0 | 32.3207 | 0 | [64, 600] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_123230__393.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4555 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_021814__972 | 0 | 0.0 | 30.7879 | 0 | [64, 570] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231227_021814__972.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4556 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231226_123106__256 | 0 | 0.0 | 21.195 | 0 | [105, 386] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_123106__256.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4557 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231226_123131__564 | 0 | 0.0 | 24.3366 | 0 | [105, 445] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_123131__564.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4558 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_021743__183 | 0 | 0.0 | 18.5746 | 0 | [105, 337] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_021743__183.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4559 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_123014__706 | 0 | 0.0 | 32.9257 | 0 | [193, 589] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_123014__706.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4560 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_123045__785 | 0 | 0.0 | 30.6555 | 0 | [193, 547] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_123045__785.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4561 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_021724__352 | 0 | 0.0 | 36.5097 | 0 | [193, 490] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_021724__352.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4562 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231226_123508__174 | 0 | 0.0 | 29.1817 | 0 | [372, 488] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_123508__174.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4563 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231226_123539__947 | 0 | 0.0 | 31.0088 | 0 | [372, 518] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_123539__947.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4564 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_021926__504 | 0 | 0.0 | 30.9667 | 0 | [372, 528] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_021926__504.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4565 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231226_123402__759 | 0 | 0.0 | 29.6685 | 0 | [370, 505] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_123402__759.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4566 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_123438__756 | 3 | 0.0 | 36.2493 | 4 | [370, 616] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_123438__756.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4567 | Apple-MacBook-Pro-M1 | pig_latinify | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_021855__262 | 0 | 0.0 | 41.6473 | 0 | [370, 717] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_021855__262.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4568 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_124453__654 | 0 | 0.0 | 69.6856 | 0 | [69, 414] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_124453__654.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4569 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_124546__290 | 0 | 0.0 | 52.6454 | 0 | [69, 311] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_124546__290.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4570 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_124702__361 | 1 | 0.0 | 76.0105 | 4 | [69, 452] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_124702__361.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4571 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_154522__749 | 0 | 0.0 | 104.359 | 0 | [69, 617] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_154522__749.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4572 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_154603__728 | 0 | 0.0 | 40.1094 | 0 | [69, 233] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_154603__728.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4573 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_124139__677 | 2 | 0.0 | 62.4177 | 2 | [108, 364] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_124139__677.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 4574 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_124323__321 | 0 | 0.0 | 103.764 | 0 | [108, 610] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_124323__321.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4575 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_124343__710 | 0 | 0.0 | 19.945 | 0 | [108, 105] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_124343__710.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4576 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_154212__810 | 0 | 0.0 | 57.0933 | 0 | [108, 330] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_154212__810.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4577 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_154337__465 | 0 | 0.0 | 85.6078 | 0 | [108, 500] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_154337__465.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4578 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_123845__735 | 1 | 0.0 | 86.748 | 1 | [197, 461] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_123845__735.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4579 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_123943__701 | 1 | 0.0 | 56.3193 | 1 | [197, 311] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_123943__701.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4580 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_124037__842 | 0 | 0.0 | 53.6694 | 0 | [197, 295] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_124037__842.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4581 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_153945__796 | 1 | 0.0 | 81.1545 | 1 | [197, 456] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_153945__796.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4582 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_154114__227 | 1 | 0.0 | 88.179 | 1 | [197, 497] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_154114__227.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4583 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_125408__285 | 0 | 0.0 | 67.0015 | 0 | [382, 324] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_125408__285.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4584 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_125419__358 | 0 | 0.0 | 10.676 | 0 | [382, 5] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_125419__358.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4585 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_125614__566 | 0 | 0.0 | 114.857 | 2 | [382, 587] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_125614__566.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 4586 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_154818__634 | 0 | 0.0 | 10.5808 | 0 | [382, 5] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_154818__634.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4587 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_125118__280 | 0 | 0.0 | 106.0 | 0 | [380, 566] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_125118__280.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4588 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_125301__203 | 0 | 0.0 | 102.401 | 0 | [380, 522] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_125301__203.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4589 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_154703__260 | 0 | 0.0 | 60.8068 | 1 | [380, 304] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_154703__260.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4590 | Apple-MacBook-Pro-M1 | pig_latinify | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_154807__995 | 0 | 0.0 | 63.5109 | 0 | [380, 320] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_154807__995.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4591 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_140027__751 | 0 | 0.0 | 14.7899 | 0 | [73, 373] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_140027__751.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4592 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_140041__555 | 0 | 0.0 | 13.6465 | 0 | [73, 343] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_140041__555.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4593 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_021444__549 | 0 | 0.0 | 16.425 | 0 | [73, 412] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231227_021444__549.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4594 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_135953__691 | 0 | 0.0 | 22.0185 | 0 | [114, 549] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_135953__691.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4595 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_140012__160 | 1 | 0.0 | 19.0662 | 2 | [114, 475] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_140012__160.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 4596 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_021427__571 | 0 | 0.0 | 11.942 | 0 | [114, 292] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_021427__571.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4597 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_135913__290 | 0 | 0.0 | 24.036 | 0 | [202, 424] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_135913__290.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4598 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_135931__848 | 0 | 0.0 | 17.8621 | 0 | [202, 428] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_135931__848.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4599 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_021415__573 | 0 | 0.0 | 25.2061 | 0 | [202, 461] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_021415__573.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4600 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_140214__724 | 0 | 0.0 | 16.7107 | 0 | [381, 371] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_140214__724.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4601 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_140239__175 | 0 | 0.0 | 24.9271 | 0 | [381, 571] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_140239__175.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4602 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_021530__141 | 0 | 0.0 | 22.7108 | 0 | [381, 515] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_021530__141.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4603 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_140137__270 | 0 | 0.0 | 23.8317 | 0 | [379, 545] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_140137__270.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4604 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_140157__205 | 0 | 0.0 | 20.0822 | 0 | [379, 454] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_140157__205.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4605 | Apple-MacBook-Pro-M1 | pig_latinify | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_021507__783 | 0 | 0.0 | 23.0778 | 0 | [379, 524] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_021507__783.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4606 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 5 | 20231214_080705__434 | 0 | 0.0 | 17.7645 | 0 | [70, 523] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231214_080705__434.json | 25.0 | missing | missing | missing | |
| 4607 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 5 | 20231225_122830__431 | 0 | 0.0 | 9.95201 | 0 | [71, 322] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_122830__431.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4608 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 5 | 20231225_122839__519 | 0 | 0.0 | 9.12528 | 0 | [71, 294] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_122839__519.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4609 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 5 | 20231227_013921__812 | 0 | 0.0 | 14.5149 | 0 | [71, 468] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231227_013921__812.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4610 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_080648__302 | 0 | 0.0 | 13.7355 | 0 | [99, 398] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231214_080648__302.json | 50.0 | missing | missing | missing | |
| 4611 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_122810__422 | 0 | 0.0 | 8.84315 | 2 | [112, 279] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_122810__422.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 4612 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_122820__161 | 0 | 0.0 | 9.99183 | 0 | [112, 316] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_122820__161.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4613 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_013906__634 | 1 | 0.0 | 10.1038 | 2 | [112, 318] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231227_013906__634.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 4614 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_080634__525 | 0 | 0.0 | 24.5301 | 0 | [187, 659] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231214_080634__525.json | 25.0 | missing | missing | missing | |
| 4615 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_122749__683 | 0 | 0.0 | 20.469 | 0 | [200, 470] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_122749__683.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4616 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_122801__716 | 0 | 0.0 | 11.3589 | 0 | [200, 344] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_122801__716.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4617 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_013856__307 | 0 | 0.0 | 16.051 | 0 | [200, 334] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231227_013856__307.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4618 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_080820__640 | 0 | 0.0 | 29.216 | 0 | [11, 767] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231214_080820__640.json | 0.0 | missing | missing | missing | |
| 4619 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_123004__554 | 0 | 0.0 | 18.695 | 0 | [379, 540] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_123004__554.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4620 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_123021__579 | 0 | 0.0 | 16.4639 | 0 | [379, 472] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_123021__579.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4621 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_013953__802 | 0 | 0.0 | 11.5938 | 0 | [379, 318] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231227_013953__802.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4622 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_080750__534 | 0 | 0.0 | 27.2855 | 0 | [370, 646] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231214_080750__534.json | 0.0 | missing | missing | missing | |
| 4623 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_122920__425 | 0 | 0.0 | 18.2473 | 0 | [377, 527] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_122920__425.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4624 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_122946__954 | 0 | 0.0 | 25.3982 | 0 | [377, 742] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_122946__954.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4625 | Apple-MacBook-Pro-M1 | pig_latinify | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_013941__393 | 0 | 0.0 | 20.5509 | 0 | [377, 593] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231227_013941__393.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4626 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231214_082056__322 | 0 | 0.0 | 13.8015 | 0 | [70, 410] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__InJulia__1SHOT__20231214_082056__322.json | 0.0 | missing | missing | missing | |
| 4627 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_132003__532 | 0 | 0.0 | 6.43752 | 0 | [73, 109] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__InJulia__1SHOT__20231225_132003__532.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4628 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_132021__663 | 0 | 0.0 | 18.6527 | 0 | [73, 340] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__InJulia__1SHOT__20231225_132021__663.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4629 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231227_015854__343 | 0 | 0.0 | 3.75015 | 0 | [73, 57] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__InJulia__1SHOT__20231227_015854__343.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4630 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_082042__155 | 0 | 0.0 | 15.77 | 0 | [99, 454] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231214_082042__155.json | 50.0 | missing | missing | missing | |
| 4631 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_131940__532 | 0 | 0.0 | 16.3154 | 0 | [112, 291] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_131940__532.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4632 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_131956__508 | 0 | 0.0 | 16.203 | 0 | [112, 289] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_131956__508.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4633 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_015850__363 | 0 | 0.0 | 2.84848 | 0 | [112, 35] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231227_015850__363.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4634 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_082026__383 | 0 | 0.0 | 23.0893 | 0 | [187, 622] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231214_082026__383.json | 25.0 | missing | missing | missing | |
| 4635 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_131856__753 | 0 | 0.0 | 47.7254 | 0 | [200, 666] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_131856__753.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4636 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_131924__165 | 0 | 0.0 | 27.2942 | 0 | [200, 472] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_131924__165.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4637 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_015847__160 | 0 | 0.0 | 28.1188 | 0 | [200, 324] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231227_015847__160.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4638 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_082230__141 | 0 | 0.0 | 39.8219 | 0 | [11, 1011] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231214_082230__141.json | 25.0 | missing | missing | missing | |
| 4639 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_132156__787 | 0 | 0.0 | 15.9308 | 0 | [376, 236] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_132156__787.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4640 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_132238__753 | 0 | 0.0 | 42.3446 | 0 | [376, 693] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_132238__753.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4641 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_015953__599 | 0 | 0.0 | 5.01372 | 0 | [376, 36] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231227_015953__599.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4642 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_082150__751 | 0 | 0.0 | 41.691 | 0 | [370, 984] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231214_082150__751.json | 25.0 | missing | missing | missing | |
| 4643 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_132054__879 | 0 | 0.0 | 15.7177 | 0 | [373, 232] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_132054__879.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4644 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_132140__359 | 0 | 0.0 | 45.8638 | 0 | [373, 752] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_132140__359.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4645 | Apple-MacBook-Pro-M1 | pig_latinify | orca2:13b | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_015948__418 | 0 | 0.0 | 53.7484 | 0 | [373, 877] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231227_015948__418.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4646 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_140421__897 | 0 | 0.0 | 31.824 | 0 | [63, 1160] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_140421__897.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4647 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_140440__944 | 0 | 0.0 | 18.955 | 0 | [63, 722] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_140440__944.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4648 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_021622__312 | 0 | 0.0 | 36.9302 | 0 | [63, 1314] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231227_021622__312.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4649 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_140318__388 | 0 | 0.0 | 11.0023 | 0 | [100, 420] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_140318__388.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4650 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_140349__476 | 0 | 0.0 | 30.9699 | 0 | [100, 1119] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_140349__476.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4651 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_021545__722 | 0 | 0.0 | 10.3744 | 0 | [100, 394] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_021545__722.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4652 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_140305__636 | 0 | 0.0 | 25.9261 | 0 | [187, 812] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_140305__636.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4653 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_140307__982 | 0 | 0.0 | 1.66878 | 0 | [187, 45] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_140307__982.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4654 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_021535__171 | 0 | 0.0 | 4.90598 | 0 | [187, 36] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_021535__171.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4655 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_140607__579 | 0 | 0.0 | 20.9012 | 0 | [352, 715] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_140607__579.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4656 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_140610__883 | 0 | 0.0 | 3.17258 | 0 | [352, 80] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_140610__883.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4657 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_021648__301 | 0 | 0.0 | 12.7265 | 0 | [352, 430] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_021648__301.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4658 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_140531__619 | 0 | 0.0 | 28.9073 | 0 | [349, 978] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_140531__619.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4659 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_140546__576 | 0 | 0.0 | 15.3841 | 0 | [349, 527] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_140546__576.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4660 | Apple-MacBook-Pro-M1 | pig_latinify | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_021635__476 | 0 | 0.0 | 13.0256 | 0 | [349, 441] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_021635__476.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4661 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | InJulia | 1SHOT | true | false | 5 | 20231214_082317__458 | 0 | 0.0 | 13.1889 | 0 | [70, 392] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231214_082317__458.json | 25.0 | missing | missing | missing | |
| 4662 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_132708__917 | 3 | 0.0 | 62.7314 | 4 | [81, 492] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_132708__917.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4663 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_132801__488 | 1 | 0.0 | 52.7308 | 4 | [81, 413] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_132801__488.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4664 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231227_020250__865 | 1 | 0.0 | 45.0245 | 2 | [81, 345] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231227_020250__865.json | 67.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 4665 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_082304__370 | 0 | 0.0 | 14.6784 | 0 | [99, 423] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231214_082304__370.json | 0.0 | missing | missing | missing | |
| 4666 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_132510__441 | 2 | 0.0 | 43.5223 | 2 | [120, 333] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_132510__441.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 4667 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_132605__301 | 2 | 0.0 | 54.7192 | 2 | [120, 422] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_132605__301.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 4668 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_020204__927 | 3 | 0.0 | 54.7475 | 4 | [120, 417] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231227_020204__927.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4669 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_082249__196 | 0 | 0.0 | 19.2682 | 0 | [187, 518] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_082249__196.json | 25.0 | missing | missing | missing | |
| 4670 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_132329__447 | 0 | 0.0 | 50.7477 | 0 | [208, 197] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_132329__447.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4671 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_132426__391 | 2 | 0.0 | 57.2676 | 2 | [208, 424] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_132426__391.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 4672 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_020109__494 | 3 | 0.0 | 76.3443 | 2 | [208, 409] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_020109__494.json | 77.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 4673 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_082410__421 | 0 | 0.0 | 9.46659 | 0 | [11, 264] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_082410__421.json | 0.0 | missing | missing | missing | |
| 4674 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_133247__654 | 0 | 0.0 | 53.572 | 0 | [384, 361] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_133247__654.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4675 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_133344__452 | 2 | 0.0 | 57.5674 | 2 | [384, 392] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_133344__452.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 4676 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_020448__436 | 2 | 0.0 | 57.8705 | 2 | [384, 391] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_020448__436.json | 72.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 4677 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_082400__618 | 0 | 0.0 | 24.6635 | 4 | [370, 581] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231214_082400__618.json | 75.0 | missing | missing | missing | |
| 4678 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_133034__780 | 0 | 0.0 | 61.7081 | 0 | [381, 424] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_133034__780.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4679 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_133153__836 | 0 | 0.0 | 78.833 | 0 | [381, 555] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_133153__836.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4680 | Apple-MacBook-Pro-M1 | pig_latinify | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_020350__474 | 1 | 0.0 | 59.9814 | 1 | [381, 408] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231227_020350__474.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4681 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_135056__605 | 0 | 0.0 | 24.4731 | 0 | [73, 416] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_135056__605.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4682 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_135124__417 | 1 | 0.0 | 27.5513 | 1 | [73, 469] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_135124__417.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4683 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_021130__283 | 0 | 0.0 | 28.0632 | 0 | [73, 476] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231227_021130__283.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4684 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_135002__442 | 0 | 0.0 | 13.8568 | 0 | [114, 227] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_135002__442.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4685 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_135032__600 | 0 | 0.0 | 29.479 | 0 | [114, 495] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_135032__600.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4686 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_021102__810 | 0 | 0.0 | 29.5296 | 0 | [114, 494] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_021102__810.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4687 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_134926__616 | 0 | 0.0 | 36.9362 | 0 | [202, 453] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_134926__616.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4688 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_134949__995 | 0 | 0.0 | 22.4764 | 0 | [202, 360] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_134949__995.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4689 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_021033__639 | 0 | 0.0 | 27.3627 | 0 | [202, 293] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_021033__639.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4690 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_135331__162 | 0 | 0.0 | 24.9466 | 0 | [381, 373] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_135331__162.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4691 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_135413__350 | 0 | 0.0 | 41.3785 | 0 | [381, 641] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_135413__350.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4692 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_021222__286 | 0 | 0.0 | 22.84 | 0 | [381, 337] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_021222__286.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4693 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_135244__159 | 0 | 0.0 | 23.9556 | 0 | [379, 357] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_135244__159.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4694 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_135306__568 | 0 | 0.0 | 21.7273 | 0 | [379, 320] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_135306__568.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4695 | Apple-MacBook-Pro-M1 | pig_latinify | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_021159__626 | 0 | 0.0 | 28.7177 | 0 | [379, 434] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_021159__626.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4696 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | InJulia | 1SHOT | true | false | 5 | 20231214_081858__617 | 0 | 0.0 | 14.8352 | 0 | [70, 441] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__InJulia__1SHOT__20231214_081858__617.json | 25.0 | missing | missing | missing | |
| 4697 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231225_131644__856 | 0 | 0.0 | 7.91779 | 0 | [75, 450] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_131644__856.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4698 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | InJulia | 1SHOT | true | false | 5 | 20231225_131654__766 | 0 | 0.0 | 9.14437 | 0 | [75, 516] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_131654__766.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4699 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | InJulia | 1SHOT | true | false | 5 | 20231227_015800__398 | 0 | 0.0 | 10.3343 | 0 | [75, 573] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__InJulia__1SHOT__20231227_015800__398.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4700 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_081843__458 | 0 | 0.0 | 12.5548 | 0 | [99, 362] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231214_081843__458.json | 0.0 | missing | missing | missing | |
| 4701 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_131627__100 | 0 | 0.0 | 8.09751 | 0 | [112, 451] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_131627__100.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4702 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_131637__198 | 0 | 0.0 | 9.43657 | 0 | [112, 523] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_131637__198.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4703 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_015750__702 | 0 | 0.0 | 10.4733 | 0 | [112, 570] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231227_015750__702.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4704 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_081830__586 | 0 | 0.0 | 24.7267 | 0 | [187, 663] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231214_081830__586.json | 50.0 | missing | missing | missing | |
| 4705 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_131612__431 | 0 | 0.0 | 18.1742 | 0 | [196, 788] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_131612__431.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4706 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_131619__542 | 0 | 0.0 | 7.37589 | 0 | [196, 389] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_131619__542.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4707 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_015739__327 | 0 | 0.0 | 12.5823 | 0 | [196, 515] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231227_015739__327.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4708 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_082003__715 | 0 | 0.0 | 25.9145 | 0 | [11, 688] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231214_082003__715.json | 50.0 | missing | missing | missing | |
| 4709 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_131751__851 | 0 | 0.0 | 13.9418 | 0 | [362, 666] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_131751__851.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4710 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_131809__403 | 0 | 0.0 | 17.8849 | 0 | [362, 847] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_131809__403.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4711 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_015819__825 | 0 | 0.0 | 12.0121 | 0 | [362, 568] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231227_015819__825.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4712 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_081937__240 | 0 | 0.0 | 25.5711 | 0 | [370, 603] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231214_081937__240.json | 0.0 | missing | missing | missing | |
| 4713 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_131728__333 | 0 | 0.0 | 10.3448 | 0 | [360, 492] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_131728__333.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4714 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_131737__616 | 0 | 0.0 | 8.99895 | 0 | [360, 427] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_131737__616.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4715 | Apple-MacBook-Pro-M1 | pig_latinify | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_015807__390 | 0 | 0.0 | 7.26255 | 0 | [360, 334] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231227_015807__390.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4716 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | InJulia | 1SHOT | true | false | 5 | 20231214_080904__504 | 0 | 0.0 | 12.1193 | 0 | [70, 361] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__InJulia__1SHOT__20231214_080904__504.json | 25.0 | missing | missing | missing | |
| 4717 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_123132__737 | 0 | 0.0 | 15.6397 | 0 | [73, 507] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_123132__737.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4718 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | InJulia | 1SHOT | true | false | 5 | 20231225_123142__924 | 0 | 0.0 | 9.73797 | 0 | [73, 314] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_123142__924.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4719 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231227_014042__401 | 0 | 0.0 | 12.1524 | 0 | [73, 391] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__InJulia__1SHOT__20231227_014042__401.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4720 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_080852__748 | 0 | 0.0 | 10.807 | 0 | [99, 312] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231214_080852__748.json | 25.0 | missing | missing | missing | |
| 4721 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_123104__817 | 0 | 0.0 | 14.284 | 0 | [114, 455] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_123104__817.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4722 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_123116__383 | 0 | 0.0 | 12.4757 | 0 | [114, 397] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_123116__383.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4723 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_014029__246 | 0 | 0.0 | 17.399 | 0 | [114, 551] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231227_014029__246.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4724 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_080841__522 | 0 | 0.0 | 21.3463 | 0 | [187, 576] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231214_080841__522.json | 0.0 | missing | missing | missing | |
| 4725 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_123037__156 | 0 | 0.0 | 16.3647 | 0 | [202, 333] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_123037__156.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4726 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_123049__829 | 0 | 0.0 | 11.8429 | 0 | [202, 357] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_123049__829.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4727 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_014012__897 | 1 | 0.0 | 18.8181 | 1 | [202, 415] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231227_014012__897.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4728 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_081006__474 | 0 | 0.0 | 15.4694 | 0 | [11, 425] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231214_081006__474.json | 0.0 | missing | missing | missing | |
| 4729 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_123241__948 | 1 | 0.0 | 11.1697 | 1 | [381, 306] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_123241__948.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4730 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_123253__237 | 0 | 0.0 | 12.2955 | 0 | [381, 342] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_123253__237.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4731 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_014116__936 | 0 | 0.0 | 22.062 | 1 | [381, 637] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231227_014116__936.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4732 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_080950__590 | 0 | 0.0 | 22.8869 | 0 | [370, 536] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231214_080950__590.json | 50.0 | missing | missing | missing | |
| 4733 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_123215__461 | 0 | 0.0 | 12.857 | 1 | [379, 359] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_123215__461.json | 56.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4734 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_123229__164 | 1 | 0.0 | 14.1804 | 1 | [379, 401] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_123229__164.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4735 | Apple-MacBook-Pro-M1 | pig_latinify | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_014054__288 | 0 | 0.0 | 12.641 | 0 | [379, 350] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231227_014054__288.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4736 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231214_081055__763 | 0 | 0.0 | 17.8315 | 0 | [70, 524] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__InJulia__1SHOT__20231214_081055__763.json | 25.0 | missing | missing | missing | |
| 4737 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231225_123848__320 | 0 | 0.0 | 109.408 | 0 | [69, 822] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_123848__320.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4738 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231225_124103__605 | 0 | 0.0 | 134.581 | 0 | [69, 1002] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_124103__605.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4739 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231227_014510__408 | 0 | 0.0 | 70.8413 | 0 | [69, 533] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__InJulia__1SHOT__20231227_014510__408.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4740 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_081037__415 | 0 | 0.0 | 13.6236 | 0 | [99, 394] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231214_081037__415.json | 25.0 | missing | missing | missing | |
| 4741 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_123608__455 | 0 | 0.0 | 33.2181 | 0 | [108, 242] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_123608__455.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4742 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_123659__813 | 0 | 0.0 | 50.1418 | 0 | [108, 372] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_123659__813.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4743 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_014359__337 | 0 | 0.0 | 62.8498 | 0 | [108, 467] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231227_014359__337.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4744 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_081023__306 | 0 | 0.0 | 17.1399 | 0 | [187, 463] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231214_081023__306.json | 50.0 | missing | missing | missing | |
| 4745 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_123442__694 | 0 | 0.0 | 109.179 | 0 | [197, 621] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_123442__694.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4746 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_123535__408 | 1 | 0.0 | 52.6172 | 1 | [197, 373] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_123535__408.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4747 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_014255__455 | 0 | 0.0 | 97.788 | 3 | [197, 539] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231227_014255__455.json | 68.75 | missing | {\n "num_gpu": 99\n} | missing | |
| 4748 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_081158__371 | 0 | 0.0 | 16.0121 | 0 | [11, 440] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231214_081158__371.json | 0.0 | missing | missing | missing | |
| 4749 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_124650__770 | 0 | 0.0 | 10.0321 | 0 | [382, 17] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_124650__770.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4750 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_124813__519 | 0 | 0.0 | 82.4902 | 0 | [382, 558] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_124813__519.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4751 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_014714__686 | 0 | 0.0 | 84.0358 | 0 | [382, 567] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231227_014714__686.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4752 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_081142__961 | 0 | 0.0 | 26.2699 | 0 | [370, 621] | 0.4.0 | 4 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231214_081142__961.json | 50.0 | missing | missing | missing | |
| 4753 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_124553__428 | 0 | 0.0 | 96.1783 | 0 | [380, 657] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_124553__428.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4754 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_124640__289 | 1 | 0.0 | 47.0937 | 1 | [380, 299] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_124640__289.json | 61.25 | missing | {\n "num_gpu": 99\n} | missing | |
| 4755 | Apple-MacBook-Pro-M1 | pig_latinify | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_014550__618 | 0 | 0.0 | 40.2905 | 0 | [380, 247] | 0.6.0 | 4 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/pig_latinify/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231227_014550__618.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4756 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231214_083245__590 | 0 | 0.0 | 10.1677 | 1 | [99, 294] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231214_083245__590.json | 58.3333 | missing | missing | missing | |
| 4757 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_095852__559 | 0 | 0.0 | 18.9375 | 2 | [107, 341] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_095852__559.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4758 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_095918__253 | 0 | 0.0 | 26.0104 | 0 | [107, 470] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_095918__253.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4759 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231227_023107__730 | 0 | 0.0 | 18.3067 | 2 | [107, 327] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231227_023107__730.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4760 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | InJulia | 1SHOT | true | false | 5 | 20231227_081626__775 | 0 | 0.0 | 8.66274 | 0 | [107, 147] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231227_081626__775.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4761 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_083235__267 | 0 | 0.0 | 2.74352 | 0 | [128, 60] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231214_083235__267.json | 50.0 | missing | missing | missing | |
| 4762 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_095825__934 | 0 | 0.0 | 16.8032 | 1 | [145, 294] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_095825__934.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 4763 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_095833__643 | 0 | 0.0 | 7.49558 | 2 | [145, 119] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_095833__643.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4764 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_023048__360 | 0 | 0.0 | 22.5964 | 2 | [145, 400] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231227_023048__360.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4765 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_081617__675 | 0 | 0.0 | 12.8509 | 2 | [145, 220] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231227_081617__675.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4766 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_083232__945 | 0 | 0.0 | 19.0877 | 0 | [229, 496] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231214_083232__945.json | 50.0 | missing | missing | missing | |
| 4767 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_095751__741 | 0 | 0.0 | 29.9371 | 3 | [247, 334] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_095751__741.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4768 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_095808__674 | 0 | 0.0 | 15.9587 | 0 | [247, 256] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_095808__674.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4769 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_023026__478 | 0 | 0.0 | 30.1953 | 2 | [247, 349] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231227_023026__478.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4770 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_081604__194 | 0 | 0.0 | 23.1759 | 0 | [247, 220] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231227_081604__194.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4771 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_083351__576 | 0 | 0.0 | 30.824 | 0 | [11, 799] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231214_083351__576.json | 50.0 | missing | missing | missing | |
| 4772 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_100053__339 | 0 | 0.0 | 28.1159 | 1 | [410, 440] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_100053__339.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 4773 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_100126__556 | 0 | 0.0 | 32.6241 | 0 | [410, 514] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_100126__556.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4774 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_023206__786 | 2 | 0.0 | 21.9796 | 1 | [410, 337] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231227_023206__786.json | 68.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 4775 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_081713__675 | 0 | 0.0 | 21.973 | 1 | [410, 337] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231227_081713__675.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 4776 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_083320__385 | 0 | 0.0 | 19.228 | 0 | [399, 428] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231214_083320__385.json | 0.0 | missing | missing | missing | |
| 4777 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_100008__837 | 0 | 0.0 | 17.4432 | 0 | [407, 257] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_100008__837.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4778 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_100024__592 | 0 | 0.0 | 16.7871 | 1 | [407, 240] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_100024__592.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 4779 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_023144__504 | 0 | 0.0 | 37.0013 | 2 | [407, 595] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231227_023144__504.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4780 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_081651__914 | 0 | 0.0 | 25.2054 | 0 | [407, 393] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231227_081651__914.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4781 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-python | InJulia | 1SHOT | true | false | 5 | 20231214_083430__766 | 0 | 0.0 | 14.2994 | 0 | [99, 414] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-python/evaluation__InJulia__1SHOT__20231214_083430__766.json | 25.0 | missing | missing | missing | |
| 4782 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-python | InJulia | 1SHOT | true | false | 5 | 20231225_100223__568 | 0 | 0.0 | 7.44002 | 0 | [81, 125] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_100223__568.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4783 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_100230__434 | 0 | 0.0 | 7.48011 | 0 | [81, 127] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_100230__434.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4784 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_083415__768 | 0 | 0.0 | 10.619 | 0 | [128, 301] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231214_083415__768.json | 50.0 | missing | missing | missing | |
| 4785 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_100204__442 | 0 | 0.0 | 21.1519 | 0 | [82, 381] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_100204__442.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4786 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_100215__923 | 0 | 0.0 | 11.0315 | 0 | [82, 192] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_100215__923.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4787 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_083405__320 | 0 | 0.0 | 14.1434 | 1 | [229, 362] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231214_083405__320.json | 58.3333 | missing | missing | missing | |
| 4788 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_100141__735 | 0 | 0.0 | 15.3805 | 0 | [122, 84] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_100141__735.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4789 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_100143__983 | 0 | 0.0 | 2.0963 | 0 | [122, 20] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_100143__983.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4790 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_083533__725 | 0 | 0.0 | 17.2072 | 0 | [11, 466] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231214_083533__725.json | 25.0 | missing | missing | missing | |
| 4791 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_100347__874 | 0 | 0.0 | 15.5741 | 0 | [99, 273] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_100347__874.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4792 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_100351__261 | 0 | 0.0 | 4.27512 | 0 | [99, 62] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_100351__261.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4793 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_083515__394 | 0 | 0.0 | 28.2869 | 0 | [399, 655] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231214_083515__394.json | 0.0 | missing | missing | missing | |
| 4794 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_100315__736 | 0 | 0.0 | 7.34425 | 0 | [96, 123] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_100315__736.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4795 | Apple-MacBook-Pro-M1 | q_and_a_extractor | codellama:13b-python | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_100331__554 | 0 | 0.0 | 16.236 | 0 | [96, 288] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_100331__554.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4796 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231225_111843__906 | 0 | 0.0 | 48.7137 | 0 | [107, 283] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_111843__906.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4797 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_082923__513 | 0 | 0.0 | 63.564 | 2 | [107, 380] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_082923__513.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4798 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_102435__200 | 0 | 0.0 | 50.6783 | 0 | [148, 295] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_102435__200.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4799 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_102514__119 | 0 | 0.0 | 39.7093 | 0 | [148, 227] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_102514__119.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4800 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_082819__573 | 0 | 0.0 | 44.6934 | 0 | [148, 257] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_082819__573.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4801 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_102257__567 | 0 | 0.0 | 77.1385 | 0 | [250, 277] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_102257__567.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4802 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_102344__800 | 0 | 0.0 | 47.1144 | 0 | [250, 256] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_102344__800.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4803 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_082734__750 | 0 | 0.0 | 91.8928 | 0 | [250, 377] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_082734__750.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4804 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_112409__143 | 0 | 0.0 | 72.6561 | 0 | [436, 375] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_112409__143.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4805 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_112536__495 | 0 | 0.0 | 86.5173 | 0 | [436, 458] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_112536__495.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4806 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_083159__492 | 3 | 0.0 | 76.5146 | 3 | [436, 398] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_083159__492.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4807 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_112130__292 | 0 | 0.0 | 50.5665 | 3 | [434, 235] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_112130__292.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4808 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_112256__889 | 0 | 0.0 | 85.5161 | 0 | [434, 442] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_112256__889.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4809 | Apple-MacBook-Pro-M1 | q_and_a_extractor | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_083042__182 | 0 | 0.0 | 78.9185 | 0 | [434, 412] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_083042__182.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4810 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_084044__258 | 0 | 0.0 | 10.1725 | 0 | [102, 386] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_084044__258.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4811 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_125708__394 | 0 | 0.0 | 9.26925 | 0 | [102, 345] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_125708__394.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4812 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_125728__278 | 0 | 0.0 | 20.3521 | 0 | [102, 745] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_125728__278.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4813 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_125739__654 | 0 | 0.0 | 10.6576 | 0 | [102, 404] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_125739__654.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4814 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_084033__410 | 0 | 0.0 | 5.95006 | 0 | [139, 219] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_084033__410.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4815 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_125635__419 | 0 | 0.0 | 3.49661 | 0 | [139, 119] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_125635__419.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4816 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_125641__189 | 0 | 0.0 | 6.55809 | 0 | [139, 232] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_125641__189.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4817 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_125659__581 | 0 | 0.0 | 17.2792 | 0 | [139, 616] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_125659__581.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4818 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_084028__555 | 0 | 0.0 | 11.4824 | 0 | [238, 282] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_084028__555.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4819 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_125623__899 | 0 | 0.0 | 8.2374 | 0 | [238, 150] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_125623__899.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4820 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_125627__172 | 0 | 0.0 | 4.30891 | 0 | [238, 136] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_125627__172.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4821 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_125631__187 | 0 | 0.0 | 4.14138 | 0 | [238, 129] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_125631__187.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4822 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_084057__719 | 0 | 0.0 | 6.96847 | 0 | [391, 212] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_084057__719.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4823 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_125815__592 | 0 | 0.0 | 9.32438 | 0 | [391, 297] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_125815__592.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4824 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_125827__862 | 0 | 0.0 | 11.381 | 0 | [391, 366] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_125827__862.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4825 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_125836__672 | 0 | 0.0 | 9.78339 | 0 | [391, 313] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_125836__672.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4826 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_084050__449 | 0 | 0.0 | 5.99048 | 0 | [388, 176] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_084050__449.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4827 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_125748__860 | 0 | 0.0 | 8.84251 | 0 | [388, 280] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_125748__860.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4828 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_125757__645 | 0 | 0.0 | 9.46161 | 0 | [388, 302] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_125757__645.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4829 | Apple-MacBook-Pro-M1 | q_and_a_extractor | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_125806__582 | 0 | 0.0 | 8.46973 | 0 | [388, 266] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_125806__582.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4830 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | InJulia | 1SHOT | true | false | 5 | 20231214_082500__225 | 0 | 0.0 | 14.194 | 0 | [99, 411] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__InJulia__1SHOT__20231214_082500__225.json | 25.0 | missing | missing | missing | |
| 4831 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | InJulia | 1SHOT | true | false | 5 | 20231225_093736__274 | 0 | 0.0 | 28.721 | 0 | [99, 806] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__InJulia__1SHOT__20231225_093736__274.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4832 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | InJulia | 1SHOT | true | true | 5 | 20231225_093755__702 | 0 | 0.0 | 19.352 | 0 | [1, 578] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__InJulia__1SHOT__20231225_093755__702.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4833 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | InJulia | 1SHOT | true | true | 5 | 20231227_022112__800 | 0 | 0.0 | 10.0206 | 0 | [99, 295] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__InJulia__1SHOT__20231227_022112__800.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4834 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | InJulia | 1SHOT | true | false | 5 | 20231227_080554__504 | 0 | 0.0 | 21.2807 | 0 | [99, 614] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__InJulia__1SHOT__20231227_080554__504.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4835 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_082446__587 | 0 | 0.0 | 15.4985 | 0 | [128, 440] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaExpertAsk__1SHOT__20231214_082446__587.json | 25.0 | missing | missing | missing | |
| 4836 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_093658__707 | 0 | 0.0 | 6.29249 | 0 | [128, 173] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_093658__707.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4837 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_093707__367 | 0 | 0.0 | 8.54217 | 0 | [1, 266] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_093707__367.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4838 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_022102__522 | 0 | 0.0 | 11.2893 | 0 | [128, 326] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaExpertAsk__1SHOT__20231227_022102__522.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4839 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_080531__308 | 0 | 0.0 | 10.1158 | 0 | [128, 287] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaExpertAsk__1SHOT__20231227_080531__308.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4840 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_082431__596 | 0 | 0.0 | 20.6258 | 0 | [229, 537] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_082431__596.json | 50.0 | missing | missing | missing | |
| 4841 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_093640__413 | 0 | 0.0 | 21.9471 | 2 | [247, 436] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_093640__413.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4842 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_093652__233 | 0 | 0.0 | 12.0759 | 0 | [1, 356] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_093652__233.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4843 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_022051__506 | 0 | 0.0 | 23.1667 | 0 | [247, 485] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_022051__506.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4844 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_080521__220 | 0 | 0.0 | 23.107 | 0 | [247, 373] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_080521__220.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4845 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_082600__493 | 0 | 0.0 | 19.5082 | 0 | [11, 525] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_082600__493.json | 0.0 | missing | missing | missing | |
| 4846 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_093930__737 | 0 | 0.0 | 21.6914 | 0 | [11, 582] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_093930__737.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4847 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_093951__414 | 0 | 0.0 | 20.4782 | 0 | [1, 557] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_093951__414.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4848 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_022215__443 | 0 | 0.0 | 24.5559 | 0 | [11, 659] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_022215__443.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4849 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_080621__394 | 0 | 0.0 | 4.80478 | 0 | [11, 135] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_080621__394.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4850 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_082541__575 | 0 | 0.0 | 23.0249 | 0 | [399, 525] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaRecapTask__1SHOT__20231214_082541__575.json | 50.0 | missing | missing | missing | |
| 4851 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_093846__992 | 0 | 0.0 | 19.9981 | 0 | [399, 450] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_093846__992.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4852 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_093908__419 | 0 | 0.0 | 22.766 | 0 | [1, 613] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_093908__419.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4853 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_022150__816 | 0 | 0.0 | 37.8367 | 0 | [399, 892] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaRecapTask__1SHOT__20231227_022150__816.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4854 | Apple-MacBook-Pro-M1 | q_and_a_extractor | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_080616__576 | 0 | 0.0 | 21.8456 | 0 | [399, 502] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/llama2/evaluation__JuliaRecapTask__1SHOT__20231227_080616__576.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4855 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | InJulia | 1SHOT | true | false | 5 | 20231214_083623__729 | 0 | 0.0 | 20.608 | 0 | [99, 590] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__InJulia__1SHOT__20231214_083623__729.json | 25.0 | missing | missing | missing | |
| 4856 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_100452__820 | 0 | 0.0 | 9.60036 | 2 | [99, 309] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__InJulia__1SHOT__20231225_100452__820.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4857 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_100500__136 | 0 | 0.0 | 8.27445 | 1 | [99, 265] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__InJulia__1SHOT__20231225_100500__136.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 4858 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | InJulia | 1SHOT | true | true | 5 | 20231227_023242__227 | 0 | 0.0 | 10.6637 | 1 | [99, 343] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__InJulia__1SHOT__20231227_023242__227.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 4859 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | InJulia | 1SHOT | true | true | 5 | 20231227_081747__479 | 1 | 0.0 | 7.57262 | 2 | [99, 240] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__InJulia__1SHOT__20231227_081747__479.json | 71.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4860 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_083602__361 | 0 | 0.0 | 2.72257 | 0 | [128, 59] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231214_083602__361.json | 50.0 | missing | missing | missing | |
| 4861 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_100427__371 | 0 | 0.0 | 12.1747 | 0 | [138, 384] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_100427__371.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4862 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_100442__925 | 0 | 0.0 | 14.3777 | 0 | [138, 454] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_100442__925.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4863 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_023231__146 | 0 | 0.0 | 9.88232 | 2 | [138, 312] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231227_023231__146.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4864 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_081740__369 | 0 | 0.0 | 7.64161 | 0 | [138, 236] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231227_081740__369.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4865 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_083559__846 | 0 | 0.0 | 26.5545 | 0 | [229, 690] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231214_083559__846.json | 50.0 | missing | missing | missing | |
| 4866 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_100406__545 | 0 | 0.0 | 14.967 | 0 | [239, 255] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_100406__545.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4867 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_100415__864 | 0 | 0.0 | 8.99197 | 2 | [239, 262] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_100415__864.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4868 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_023221__803 | 0 | 0.0 | 14.8256 | 0 | [239, 267] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231227_023221__803.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4869 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_081732__292 | 0 | 0.0 | 18.985 | 0 | [239, 400] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231227_081732__292.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4870 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_083734__729 | 0 | 0.0 | 29.4308 | 0 | [11, 767] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231214_083734__729.json | 50.0 | missing | missing | missing | |
| 4871 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_100552__862 | 1 | 0.0 | 14.1844 | 2 | [402, 398] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_100552__862.json | 71.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4872 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_100602__770 | 0 | 0.0 | 9.70611 | 2 | [402, 259] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_100602__770.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4873 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_081808__499 | 0 | 0.0 | 8.51722 | 0 | [402, 217] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231227_081808__499.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4874 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_083704__958 | 0 | 0.0 | 23.7742 | 0 | [399, 544] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaRecapTask__1SHOT__20231214_083704__958.json | 50.0 | missing | missing | missing | |
| 4875 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_100528__669 | 0 | 0.0 | 9.91606 | 2 | [399, 265] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_100528__669.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4876 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_100538__655 | 0 | 0.0 | 9.26251 | 3 | [399, 245] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_100538__655.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4877 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_081800__869 | 0 | 0.0 | 12.579 | 0 | [399, 344] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder/evaluation__JuliaRecapTask__1SHOT__20231227_081800__869.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4878 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_182630__968 | 0 | 0.0 | 20.1125 | 0 | [99, 373] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_182630__968.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4879 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_182647__648 | 1 | 0.0 | 16.794 | 3 | [99, 321] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_182647__648.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4880 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_182705__820 | 0 | 0.0 | 17.8197 | 0 | [99, 341] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_182705__820.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4881 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_182534__861 | 1 | 0.0 | 19.7071 | 3 | [138, 368] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_182534__861.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4882 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_182550__300 | 0 | 0.0 | 15.8922 | 0 | [138, 295] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_182550__300.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4883 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_182610__864 | 1 | 0.0 | 19.6312 | 2 | [138, 366] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_182610__864.json | 71.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4884 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_182441__308 | 0 | 0.0 | 19.5005 | 0 | [239, 356] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_182441__308.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4885 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_182458__436 | 0 | 0.0 | 17.0908 | 0 | [239, 302] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_182458__436.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4886 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_182515__992 | 3 | 0.0 | 16.8914 | 3 | [239, 302] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_182515__992.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4887 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_182825__195 | 1 | 0.0 | 30.765 | 2 | [402, 525] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_182825__195.json | 71.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4888 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_182844__657 | 0 | 0.0 | 19.224 | 3 | [402, 328] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_182844__657.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4889 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_182913__575 | 1 | 0.0 | 28.4236 | 2 | [402, 501] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_182913__575.json | 71.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4890 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_182717__997 | 0 | 0.0 | 11.9921 | 0 | [399, 191] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_182717__997.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4891 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_182735__949 | 3 | 0.0 | 17.3796 | 2 | [399, 294] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_182735__949.json | 81.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4892 | Apple-MacBook-Pro-M1 | q_and_a_extractor | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_182754__580 | 0 | 0.0 | 19.2066 | 0 | [399, 328] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_182754__580.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4893 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_113118__580 | 0 | 0.0 | 11.229 | 0 | [99, 277] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_113118__580.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4894 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231225_113125__853 | 0 | 0.0 | 7.46462 | 0 | [99, 179] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_113125__853.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4895 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_083442__537 | 0 | 0.0 | 11.2986 | 1 | [99, 277] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_083442__537.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 4896 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_113057__872 | 0 | 0.0 | 6.26122 | 0 | [140, 143] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_113057__872.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4897 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_113106__813 | 0 | 0.0 | 9.71446 | 0 | [140, 232] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_113106__813.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4898 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_083431__270 | 0 | 0.0 | 7.60879 | 0 | [140, 177] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_083431__270.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4899 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_113035__160 | 0 | 0.0 | 23.9099 | 1 | [241, 435] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_113035__160.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 4900 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_113050__207 | 0 | 0.0 | 14.8813 | 0 | [241, 348] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_113050__207.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4901 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_083423__118 | 0 | 0.0 | 16.6818 | 0 | [241, 257] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_083423__118.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4902 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_113218__493 | 0 | 0.0 | 9.43658 | 0 | [407, 185] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_113218__493.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4903 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_113235__286 | 0 | 0.0 | 17.5726 | 0 | [407, 388] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_113235__286.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4904 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_083506__278 | 0 | 0.0 | 8.58093 | 0 | [407, 163] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_083506__278.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4905 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_113149__367 | 0 | 0.0 | 11.4588 | 0 | [405, 236] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_113149__367.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4906 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_113208__112 | 0 | 0.0 | 19.3589 | 0 | [405, 432] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_113208__112.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4907 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_083457__629 | 0 | 0.0 | 14.773 | 0 | [405, 317] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_083457__629.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4908 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231228_002741__827 | 0 | 0.0 | 17.1914 | 0 | [98, 538] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_002741__827.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4909 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231228_002758__192 | 0 | 0.0 | 16.4642 | 0 | [98, 515] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_002758__192.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4910 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231228_002811__713 | 0 | 0.0 | 12.8535 | 3 | [98, 400] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_002811__713.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4911 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | false | false | 5 | 20231228_002829__928 | 0 | 0.0 | 18.1338 | 0 | [98, 567] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_002829__928.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4912 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231228_002851__624 | 0 | 0.0 | 21.6234 | 0 | [98, 675] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_002851__624.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4913 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_002644__585 | 0 | 0.0 | 6.8126 | 3 | [139, 200] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_002644__585.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4914 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_002653__492 | 0 | 0.0 | 8.4774 | 2 | [139, 255] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_002653__492.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4915 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231228_002703__111 | 0 | 0.0 | 10.641 | 0 | [139, 324] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_002703__111.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4916 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_002715__352 | 0 | 0.0 | 11.1555 | 3 | [139, 341] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_002715__352.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4917 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231228_002724__214 | 0 | 0.0 | 9.27323 | 0 | [139, 280] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_002724__214.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4918 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231228_002528__719 | 0 | 0.0 | 16.8928 | 0 | [240, 476] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_002528__719.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4919 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231228_002546__692 | 0 | 0.0 | 17.8362 | 0 | [240, 530] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_002546__692.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4920 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_002600__213 | 0 | 0.0 | 13.2534 | 0 | [240, 389] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_002600__213.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4921 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_002614__859 | 0 | 0.0 | 14.2113 | 0 | [240, 418] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_002614__859.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4922 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231228_002637__763 | 0 | 0.0 | 22.8415 | 0 | [240, 681] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_002637__763.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4923 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_003106__963 | 0 | 0.0 | 19.5505 | 1 | [406, 547] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_003106__963.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 4924 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_003125__335 | 0 | 0.0 | 18.3974 | 0 | [406, 512] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_003125__335.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4925 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231228_003148__407 | 0 | 0.0 | 22.8296 | 0 | [406, 644] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_003148__407.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4926 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_003204__582 | 0 | 0.0 | 16.789 | 3 | [406, 465] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_003204__582.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4927 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_003232__969 | 0 | 0.0 | 27.0076 | 0 | [406, 765] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_003232__969.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4928 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_002914__320 | 0 | 0.0 | 22.4806 | 1 | [404, 634] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_002914__320.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 4929 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_002946__256 | 0 | 0.0 | 31.7954 | 0 | [404, 900] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_002946__256.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4930 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231228_003006__582 | 0 | 0.0 | 19.7944 | 0 | [404, 554] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_003006__582.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4931 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231228_003032__849 | 0 | 0.0 | 26.5627 | 0 | [404, 752] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_003032__849.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4932 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_003046__302 | 0 | 0.0 | 14.0232 | 0 | [404, 381] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_003046__302.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4933 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_003545__718 | 0 | 0.0 | 16.4636 | 0 | [98, 407] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_003545__718.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4934 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231228_003604__689 | 0 | 0.0 | 18.936 | 0 | [98, 469] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_003604__689.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4935 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231228_003630__107 | 0 | 0.0 | 25.4726 | 0 | [98, 631] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_003630__107.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4936 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_003651__690 | 0 | 0.0 | 21.219 | 0 | [98, 526] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_003651__690.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4937 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_003708__488 | 0 | 0.0 | 16.6165 | 0 | [98, 411] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_003708__488.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4938 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_003422__170 | 0 | 0.0 | 12.1397 | 0 | [139, 292] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_003422__170.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4939 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231228_003446__363 | 0 | 0.0 | 23.5476 | 0 | [139, 577] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_003446__363.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4940 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_003506__396 | 0 | 0.0 | 20.4588 | 3 | [139, 501] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_003506__396.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4941 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_003518__240 | 0 | 0.0 | 11.4106 | 0 | [139, 273] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_003518__240.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4942 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231228_003529__988 | 0 | 0.0 | 10.4152 | 0 | [139, 248] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_003529__988.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4943 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231228_003252__143 | 0 | 0.0 | 20.1184 | 0 | [240, 454] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_003252__143.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4944 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_003308__349 | 0 | 0.0 | 16.2048 | 2 | [240, 378] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_003308__349.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4945 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231228_003330__984 | 0 | 0.0 | 21.1189 | 0 | [240, 499] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_003330__984.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4946 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_003354__410 | 0 | 0.0 | 24.0177 | 0 | [240, 570] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_003354__410.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4947 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231228_003410__736 | 0 | 0.0 | 15.8898 | 0 | [240, 370] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_003410__736.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4948 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231228_003943__146 | 0 | 0.0 | 34.7883 | 0 | [406, 792] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_003943__146.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4949 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_003958__518 | 0 | 0.0 | 15.6094 | 0 | [406, 336] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_003958__518.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4950 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_004041__882 | 0 | 0.0 | 42.1535 | 0 | [406, 960] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_004041__882.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4951 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_004100__367 | 0 | 0.0 | 19.1591 | 0 | [406, 422] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_004100__367.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4952 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_004119__235 | 0 | 0.0 | 18.6671 | 0 | [406, 410] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_004119__235.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4953 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231228_003727__941 | 0 | 0.0 | 18.6675 | 0 | [404, 410] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_003727__941.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4954 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_003758__400 | 0 | 0.0 | 31.2924 | 0 | [404, 711] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_003758__400.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4955 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231228_003818__116 | 0 | 0.0 | 20.5867 | 0 | [404, 457] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_003818__116.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4956 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_003836__150 | 0 | 0.0 | 17.4545 | 1 | [404, 381] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_003836__150.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 4957 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231228_003908__123 | 0 | 0.0 | 31.8039 | 0 | [404, 723] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_003908__123.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4958 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | false | 5 | 20231226_123720__312 | 0 | 0.0 | 20.6902 | 0 | [98, 370] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_123720__312.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4959 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231226_123811__479 | 0 | 0.0 | 50.6522 | 0 | [98, 904] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_123811__479.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4960 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_083854__936 | 0 | 0.0 | 30.2274 | 0 | [98, 552] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231227_083854__936.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4961 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_123643__213 | 0 | 0.0 | 16.4589 | 0 | [139, 293] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_123643__213.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4962 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_123659__129 | 0 | 0.0 | 16.6243 | 0 | [139, 295] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_123659__129.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4963 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_083824__711 | 0 | 0.0 | 24.3254 | 0 | [139, 439] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_083824__711.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4964 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_123558__231 | 0 | 0.0 | 18.7653 | 0 | [240, 318] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_123558__231.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4965 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_123626__720 | 0 | 0.0 | 28.0938 | 1 | [240, 485] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_123626__720.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 4966 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_083759__903 | 0 | 0.0 | 28.8019 | 1 | [240, 348] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_083759__903.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 4967 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_124027__812 | 0 | 0.0 | 28.4296 | 0 | [406, 473] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_124027__812.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4968 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231226_124056__362 | 0 | 0.0 | 28.3595 | 0 | [406, 469] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_124056__362.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4969 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_084016__307 | 0 | 0.0 | 39.4433 | 0 | [406, 673] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_084016__307.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4970 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_123914__648 | 0 | 0.0 | 21.0367 | 2 | [404, 346] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_123914__648.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4971 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_123959__426 | 0 | 0.0 | 44.7343 | 0 | [404, 744] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_123959__426.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4972 | Apple-MacBook-Pro-M1 | q_and_a_extractor | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_083936__668 | 0 | 0.0 | 42.4674 | 0 | [404, 726] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_083936__668.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4973 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_084450__966 | 0 | 0.0 | 62.9222 | 0 | [104, 367] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_084450__966.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4974 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_130656__369 | 0 | 0.0 | 78.4812 | 3 | [104, 460] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_130656__369.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4975 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_130821__420 | 0 | 0.0 | 84.4554 | 0 | [104, 496] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_130821__420.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4976 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_130929__145 | 0 | 0.0 | 68.3589 | 3 | [104, 400] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_130929__145.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4977 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_084347__351 | 0 | 0.0 | 31.534 | 3 | [143, 171] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_084347__351.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4978 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_130350__584 | 0 | 0.0 | 45.334 | 0 | [143, 254] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_130350__584.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4979 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_130500__827 | 0 | 0.0 | 70.153 | 0 | [143, 398] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_130500__827.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4980 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_130537__696 | 0 | 0.0 | 37.0932 | 0 | [143, 203] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_130537__696.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4981 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_084316__512 | 0 | 0.0 | 139.065 | 0 | [244, 652] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_084316__512.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4982 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_130024__970 | 0 | 0.0 | 107.177 | 2 | [244, 540] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_130024__970.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 4983 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_130129__810 | 0 | 0.0 | 65.3052 | 0 | [244, 343] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_130129__810.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4984 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_130304__384 | 0 | 0.0 | 94.4254 | 1 | [244, 525] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_130304__384.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 4985 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_084603__748 | 0 | 0.0 | 11.7022 | 0 | [417, 4] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_084603__748.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4986 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_131547__743 | 0 | 0.0 | 108.162 | 0 | [417, 570] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_131547__743.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4987 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_131830__318 | 0 | 0.0 | 12.1704 | 0 | [417, 5] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_131830__318.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4988 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_084551__827 | 0 | 0.0 | 60.4446 | 0 | [415, 297] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_084551__827.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4989 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_131129__941 | 0 | 0.0 | 119.689 | 3 | [415, 639] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_131129__941.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4990 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_131254__683 | 0 | 0.0 | 84.9412 | 0 | [415, 440] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_131254__683.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4991 | Apple-MacBook-Pro-M1 | q_and_a_extractor | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_131359__723 | 0 | 0.0 | 64.4689 | 0 | [415, 321] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_131359__723.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4992 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | false | false | 5 | 20231225_113346__103 | 0 | 0.0 | 16.2312 | 0 | [107, 404] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_113346__103.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4993 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_113406__582 | 0 | 0.0 | 19.6671 | 1 | [107, 491] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_113406__582.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 4994 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_083546__833 | 0 | 0.0 | 13.3685 | 0 | [107, 329] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231227_083546__833.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4995 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_113322__743 | 0 | 0.0 | 11.2448 | 0 | [148, 271] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_113322__743.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4996 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_113329__729 | 0 | 0.0 | 7.54025 | 1 | [148, 176] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_113329__729.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 4997 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_083532__438 | 0 | 0.0 | 9.45103 | 0 | [148, 224] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_083532__438.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4998 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_113253__733 | 0 | 0.0 | 17.9369 | 0 | [249, 259] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_113253__733.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 4999 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_113311__600 | 0 | 0.0 | 17.3532 | 2 | [249, 409] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_113311__600.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5000 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_083523__907 | 0 | 0.0 | 16.7413 | 0 | [249, 243] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_083523__907.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5001 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_113517__443 | 0 | 0.0 | 18.0091 | 2 | [415, 397] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_113517__443.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5002 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_113528__112 | 0 | 0.0 | 10.9723 | 2 | [415, 223] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_113528__112.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5003 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_083621__183 | 0 | 0.0 | 17.6033 | 2 | [415, 385] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_083621__183.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5004 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_113441__474 | 0 | 0.0 | 20.6363 | 3 | [413, 462] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_113441__474.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5005 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_113459__108 | 0 | 0.0 | 17.7118 | 0 | [413, 390] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_113459__108.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5006 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_083603__578 | 0 | 0.0 | 17.4645 | 2 | [413, 382] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_083603__578.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5007 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231214_082657__300 | 0 | 0.0 | 17.4722 | 0 | [99, 504] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231214_082657__300.json | 50.0 | missing | missing | missing | |
| 5008 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_094055__728 | 0 | 0.0 | 8.92261 | 0 | [105, 282] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_094055__728.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5009 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 5 | 20231225_094106__106 | 0 | 0.0 | 10.3215 | 0 | [105, 329] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_094106__106.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5010 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231227_022255__498 | 0 | 0.0 | 12.4375 | 3 | [105, 395] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231227_022255__498.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5011 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231227_080715__491 | 0 | 0.0 | 17.1303 | 1 | [105, 543] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231227_080715__491.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5012 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231214_082639__853 | 0 | 0.0 | 19.2866 | 0 | [128, 545] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231214_082639__853.json | 0.0 | missing | missing | missing | |
| 5013 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_094035__155 | 0 | 0.0 | 10.4509 | 0 | [146, 327] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_094035__155.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5014 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_094046__807 | 0 | 0.0 | 10.3384 | 2 | [146, 324] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_094046__807.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5015 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_022242__911 | 0 | 0.0 | 8.53843 | 3 | [146, 261] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231227_022242__911.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5016 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_080658__437 | 0 | 0.0 | 14.5429 | 1 | [146, 454] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231227_080658__437.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5017 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_082620__506 | 0 | 0.0 | 19.4611 | 0 | [229, 507] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231214_082620__506.json | 50.0 | missing | missing | missing | |
| 5018 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_094010__902 | 0 | 0.0 | 19.4256 | 0 | [247, 423] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_094010__902.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5019 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_094025__746 | 0 | 0.0 | 14.32 | 3 | [247, 433] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_094025__746.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5020 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_022234__681 | 0 | 0.0 | 18.7157 | 0 | [247, 411] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231227_022234__681.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5021 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_080643__338 | 0 | 0.0 | 21.6527 | 3 | [247, 497] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231227_080643__338.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5022 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_082800__691 | 0 | 0.0 | 20.9763 | 0 | [11, 562] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231214_082800__691.json | 25.0 | missing | missing | missing | |
| 5023 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_094159__797 | 0 | 0.0 | 13.5181 | 3 | [413, 375] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_094159__797.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5024 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_094213__417 | 0 | 0.0 | 13.8042 | 1 | [413, 383] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_094213__417.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5025 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_022319__688 | 0 | 0.0 | 12.6155 | 0 | [413, 344] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231227_022319__688.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5026 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_080741__258 | 0 | 0.0 | 10.9003 | 3 | [413, 290] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231227_080741__258.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5027 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_082739__902 | 0 | 0.0 | 28.7807 | 0 | [399, 668] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231214_082739__902.json | 25.0 | missing | missing | missing | |
| 5028 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_094132__927 | 0 | 0.0 | 11.1501 | 0 | [411, 300] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_094132__927.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5029 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_094145__127 | 0 | 0.0 | 12.8671 | 0 | [411, 355] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_094145__127.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5030 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_022307__618 | 0 | 0.0 | 11.7094 | 0 | [411, 316] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231227_022307__618.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5031 | Apple-MacBook-Pro-M1 | q_and_a_extractor | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_080730__530 | 0 | 0.0 | 15.164 | 3 | [411, 422] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231227_080730__530.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5032 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231214_084003__402 | 0 | 0.0 | 17.52 | 0 | [99, 505] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__InJulia__1SHOT__20231214_084003__402.json | 50.0 | missing | missing | missing | |
| 5033 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_100953__276 | 0 | 0.0 | 22.6805 | 0 | [102, 408] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__InJulia__1SHOT__20231225_100953__276.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5034 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | InJulia | 1SHOT | false | false | 5 | 20231225_101019__226 | 0 | 0.0 | 25.5595 | 0 | [102, 455] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__InJulia__1SHOT__20231225_101019__226.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5035 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | InJulia | 1SHOT | true | false | 5 | 20231227_082015__679 | 0 | 0.0 | 26.6858 | 0 | [102, 469] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__InJulia__1SHOT__20231227_082015__679.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5036 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_083945__989 | 0 | 0.0 | 7.9913 | 0 | [128, 222] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231214_083945__989.json | 50.0 | missing | missing | missing | |
| 5037 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_100858__871 | 0 | 0.0 | 28.1694 | 2 | [141, 500] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_100858__871.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5038 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_100931__392 | 0 | 0.0 | 32.6033 | 0 | [141, 578] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_100931__392.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5039 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_081948__679 | 0 | 0.0 | 18.719 | 0 | [141, 327] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231227_081948__679.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5040 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_083937__592 | 0 | 0.0 | 26.4545 | 0 | [229, 687] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231214_083937__592.json | 50.0 | missing | missing | missing | |
| 5041 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_100807__287 | 0 | 0.0 | 32.5309 | 2 | [242, 389] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_100807__287.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5042 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_100830__502 | 0 | 0.0 | 22.5545 | 2 | [242, 379] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_100830__502.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5043 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_081930__714 | 0 | 0.0 | 30.8678 | 0 | [242, 356] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231227_081930__714.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5044 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_084107__150 | 0 | 0.0 | 26.0185 | 0 | [11, 686] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231214_084107__150.json | 25.0 | missing | missing | missing | |
| 5045 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_101233__436 | 0 | 0.0 | 29.8327 | 0 | [405, 475] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_101233__436.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5046 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_101241__738 | 0 | 0.0 | 8.50061 | 0 | [405, 96] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_101241__738.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5047 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_082125__499 | 0 | 0.0 | 35.459 | 1 | [405, 550] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231227_082125__499.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5048 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_084041__581 | 0 | 0.0 | 22.9724 | 0 | [399, 524] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231214_084041__581.json | 0.0 | missing | missing | missing | |
| 5049 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_101127__979 | 0 | 0.0 | 25.0731 | 0 | [402, 393] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_101127__979.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5050 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_101203__739 | 0 | 0.0 | 35.8229 | 0 | [402, 578] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_101203__739.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5051 | Apple-MacBook-Pro-M1 | q_and_a_extractor | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_082049__634 | 0 | 0.0 | 34.0951 | 0 | [402, 540] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231227_082049__634.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5052 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_113705__591 | 0 | 0.0 | 28.7492 | 0 | [96, 1049] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_113705__591.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5053 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_113730__418 | 0 | 0.0 | 24.6802 | 0 | [96, 913] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_113730__418.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5054 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_083714__755 | 0 | 0.0 | 32.5532 | 0 | [96, 1165] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231227_083714__755.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5055 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_113617__377 | 0 | 0.0 | 42.2106 | 0 | [133, 1463] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_113617__377.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5056 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_113636__356 | 0 | 0.0 | 18.9812 | 0 | [133, 704] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_113636__356.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5057 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_083642__664 | 0 | 0.0 | 15.8399 | 0 | [133, 588] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_083642__664.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5058 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_113534__122 | 0 | 0.0 | 5.19615 | 0 | [232, 31] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_113534__122.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5059 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_113535__493 | 0 | 0.0 | 1.19206 | 0 | [232, 19] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_113535__493.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5060 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_083626__113 | 0 | 0.0 | 4.56707 | 0 | [232, 14] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_083626__113.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5061 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_113834__676 | 0 | 0.0 | 7.57878 | 0 | [385, 237] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_113834__676.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5062 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_113842__406 | 0 | 0.0 | 8.43451 | 0 | [385, 268] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_113842__406.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5063 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_083730__702 | 0 | 0.0 | 7.77404 | 0 | [385, 242] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_083730__702.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5064 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_113821__889 | 0 | 0.0 | 10.3604 | 0 | [382, 342] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_113821__889.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5065 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_113826__911 | 0 | 0.0 | 5.17462 | 0 | [382, 151] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_113826__911.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5066 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_083722__621 | 0 | 0.0 | 8.22105 | 0 | [382, 262] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_083722__621.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5067 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | InJulia | 1SHOT | true | false | 5 | 20231214_084144__124 | 0 | 0.0 | 15.371 | 0 | [99, 446] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231214_084144__124.json | 25.0 | missing | missing | missing | |
| 5068 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_101612__393 | 0 | 0.0 | 35.5435 | 3 | [110, 270] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_101612__393.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5069 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | InJulia | 1SHOT | false | false | 5 | 20231225_101701__565 | 0 | 0.0 | 48.2258 | 0 | [110, 372] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_101701__565.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5070 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231227_082353__821 | 0 | 0.0 | 49.6965 | 0 | [110, 379] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231227_082353__821.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5071 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_084128__103 | 0 | 0.0 | 2.41673 | 0 | [128, 50] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231214_084128__103.json | 50.0 | missing | missing | missing | |
| 5072 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_101500__548 | 0 | 0.0 | 49.6837 | 3 | [149, 377] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_101500__548.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5073 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_101537__317 | 2 | 0.0 | 36.1898 | 2 | [149, 269] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_101537__317.json | 76.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5074 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_082304__453 | 3 | 0.0 | 46.7901 | 2 | [149, 349] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231227_082304__453.json | 81.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5075 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_084126__382 | 0 | 0.0 | 18.8991 | 0 | [229, 492] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_084126__382.json | 0.0 | missing | missing | missing | |
| 5076 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_101328__719 | 3 | 0.0 | 46.4892 | 2 | [250, 157] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_101328__719.json | 81.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5077 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_101410__995 | 3 | 0.0 | 42.4773 | 2 | [250, 301] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_101410__995.json | 81.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5078 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_082217__716 | 2 | 0.0 | 51.8384 | 2 | [250, 205] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_082217__716.json | 76.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5079 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_084242__789 | 0 | 0.0 | 17.5778 | 0 | [11, 476] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_084242__789.json | 25.0 | missing | missing | missing | |
| 5080 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_102048__488 | 0 | 0.0 | 59.096 | 3 | [413, 397] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_102048__488.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5081 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_102139__344 | 0 | 0.0 | 51.5749 | 1 | [413, 339] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_102139__344.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5082 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_082602__807 | 2 | 0.0 | 81.0386 | 3 | [413, 542] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_082602__807.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5083 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_084224__867 | 0 | 0.0 | 27.9523 | 0 | [399, 648] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231214_084224__867.json | 25.0 | missing | missing | missing | |
| 5084 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_101921__612 | 0 | 0.0 | 45.2905 | 3 | [410, 290] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_101921__612.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5085 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_101948__446 | 2 | 0.0 | 27.2924 | 2 | [410, 149] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_101948__446.json | 76.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5086 | Apple-MacBook-Pro-M1 | q_and_a_extractor | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_082441__220 | 0 | 0.0 | 47.4156 | 3 | [410, 285] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231227_082441__220.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5087 | Apple-MacBook-Pro-M1 | q_and_a_extractor | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_112736__623 | 0 | 0.0 | 21.5736 | 0 | [107, 361] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_112736__623.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5088 | Apple-MacBook-Pro-M1 | q_and_a_extractor | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_112801__275 | 0 | 0.0 | 24.6148 | 0 | [107, 413] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_112801__275.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5089 | Apple-MacBook-Pro-M1 | q_and_a_extractor | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_083310__440 | 0 | 0.0 | 23.0073 | 0 | [107, 384] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231227_083310__440.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5090 | Apple-MacBook-Pro-M1 | q_and_a_extractor | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_112653__985 | 0 | 0.0 | 18.9684 | 0 | [148, 311] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_112653__985.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5091 | Apple-MacBook-Pro-M1 | q_and_a_extractor | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_112714__713 | 0 | 0.0 | 20.6183 | 0 | [148, 339] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_112714__713.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5092 | Apple-MacBook-Pro-M1 | q_and_a_extractor | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_083247__869 | 0 | 0.0 | 16.9588 | 0 | [148, 275] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_083247__869.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5093 | Apple-MacBook-Pro-M1 | q_and_a_extractor | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_112613__745 | 0 | 0.0 | 37.251 | 0 | [249, 451] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_112613__745.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5094 | Apple-MacBook-Pro-M1 | q_and_a_extractor | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_112634__995 | 0 | 0.0 | 21.4288 | 0 | [249, 336] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_112634__995.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5095 | Apple-MacBook-Pro-M1 | q_and_a_extractor | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_083230__495 | 0 | 0.0 | 30.9054 | 0 | [249, 344] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_083230__495.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5096 | Apple-MacBook-Pro-M1 | q_and_a_extractor | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_112946__700 | 0 | 0.0 | 15.3239 | 0 | [415, 207] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_112946__700.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5097 | Apple-MacBook-Pro-M1 | q_and_a_extractor | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_113011__238 | 0 | 0.0 | 25.609 | 0 | [415, 379] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_113011__238.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5098 | Apple-MacBook-Pro-M1 | q_and_a_extractor | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_083406__933 | 0 | 0.0 | 19.3639 | 0 | [415, 274] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_083406__933.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5099 | Apple-MacBook-Pro-M1 | q_and_a_extractor | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_112911__706 | 0 | 0.0 | 21.3917 | 0 | [413, 309] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_112911__706.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5100 | Apple-MacBook-Pro-M1 | q_and_a_extractor | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_112930__789 | 0 | 0.0 | 18.8196 | 1 | [413, 266] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_112930__789.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5101 | Apple-MacBook-Pro-M1 | q_and_a_extractor | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_083347__735 | 0 | 0.0 | 36.9014 | 2 | [413, 561] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_083347__735.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5102 | Apple-MacBook-Pro-M1 | q_and_a_extractor | stablelm-zephyr | InJulia | 1SHOT | true | false | 5 | 20231214_083818__895 | 0 | 0.0 | 15.3515 | 0 | [99, 445] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/stablelm-zephyr/evaluation__InJulia__1SHOT__20231214_083818__895.json | 25.0 | missing | missing | missing | |
| 5103 | Apple-MacBook-Pro-M1 | q_and_a_extractor | stablelm-zephyr | InJulia | 1SHOT | true | true | 5 | 20231225_100647__394 | 0 | 0.0 | 6.46986 | 0 | [108, 362] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_100647__394.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5104 | Apple-MacBook-Pro-M1 | q_and_a_extractor | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231225_100652__864 | 0 | 0.0 | 5.05451 | 0 | [108, 281] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_100652__864.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5105 | Apple-MacBook-Pro-M1 | q_and_a_extractor | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231227_081839__591 | 0 | 0.0 | 6.98425 | 0 | [108, 387] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/stablelm-zephyr/evaluation__InJulia__1SHOT__20231227_081839__591.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5106 | Apple-MacBook-Pro-M1 | q_and_a_extractor | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_083803__252 | 0 | 0.0 | 9.00863 | 0 | [128, 253] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231214_083803__252.json | 50.0 | missing | missing | missing | |
| 5107 | Apple-MacBook-Pro-M1 | q_and_a_extractor | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_100631__795 | 0 | 0.0 | 5.04334 | 0 | [145, 273] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_100631__795.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5108 | Apple-MacBook-Pro-M1 | q_and_a_extractor | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_100640__647 | 0 | 0.0 | 9.25819 | 0 | [145, 502] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_100640__647.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5109 | Apple-MacBook-Pro-M1 | q_and_a_extractor | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_081832__860 | 0 | 0.0 | 9.34676 | 0 | [145, 502] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231227_081832__860.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5110 | Apple-MacBook-Pro-M1 | q_and_a_extractor | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_083754__440 | 0 | 0.0 | 19.8381 | 0 | [229, 517] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231214_083754__440.json | 25.0 | missing | missing | missing | |
| 5111 | Apple-MacBook-Pro-M1 | q_and_a_extractor | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_100612__365 | 0 | 0.0 | 10.3598 | 0 | [241, 384] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_100612__365.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5112 | Apple-MacBook-Pro-M1 | q_and_a_extractor | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_100626__997 | 0 | 0.0 | 13.485 | 0 | [241, 686] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_100626__997.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5113 | Apple-MacBook-Pro-M1 | q_and_a_extractor | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_081823__219 | 0 | 0.0 | 14.0333 | 0 | [241, 575] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231227_081823__219.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5114 | Apple-MacBook-Pro-M1 | q_and_a_extractor | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_083911__300 | 0 | 0.0 | 20.768 | 0 | [11, 557] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231214_083911__300.json | 0.0 | missing | missing | missing | |
| 5115 | Apple-MacBook-Pro-M1 | q_and_a_extractor | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_100727__544 | 0 | 0.0 | 5.88515 | 0 | [395, 256] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_100727__544.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5116 | Apple-MacBook-Pro-M1 | q_and_a_extractor | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_100734__217 | 0 | 0.0 | 7.21581 | 0 | [395, 325] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_100734__217.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5117 | Apple-MacBook-Pro-M1 | q_and_a_extractor | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_081859__655 | 0 | 0.0 | 8.80301 | 0 | [395, 399] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231227_081859__655.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5118 | Apple-MacBook-Pro-M1 | q_and_a_extractor | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_083850__174 | 0 | 0.0 | 20.0725 | 0 | [399, 450] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231214_083850__174.json | 50.0 | missing | missing | missing | |
| 5119 | Apple-MacBook-Pro-M1 | q_and_a_extractor | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_100708__699 | 0 | 0.0 | 7.53742 | 0 | [393, 339] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_100708__699.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5120 | Apple-MacBook-Pro-M1 | q_and_a_extractor | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_100721__178 | 0 | 0.0 | 13.1845 | 0 | [393, 618] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_100721__178.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5121 | Apple-MacBook-Pro-M1 | q_and_a_extractor | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_081850__691 | 0 | 0.0 | 10.9537 | 0 | [393, 502] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231227_081850__691.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5122 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | InJulia | 1SHOT | true | false | 5 | 20231214_082846__848 | 0 | 0.0 | 18.5378 | 0 | [99, 533] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__InJulia__1SHOT__20231214_082846__848.json | 25.0 | missing | missing | missing | |
| 5123 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_094325__456 | 0 | 0.0 | 11.3287 | 0 | [107, 360] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_094325__456.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5124 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_094336__967 | 0 | 0.0 | 10.7859 | 0 | [107, 344] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_094336__967.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5125 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231227_022356__623 | 0 | 0.0 | 9.9457 | 0 | [107, 313] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__InJulia__1SHOT__20231227_022356__623.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5126 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231227_080827__403 | 0 | 0.0 | 13.054 | 1 | [107, 413] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__InJulia__1SHOT__20231227_080827__403.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5127 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_082827__702 | 0 | 0.0 | 11.2828 | 0 | [128, 320] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231214_082827__702.json | 50.0 | missing | missing | missing | |
| 5128 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_094258__567 | 0 | 0.0 | 12.0129 | 1 | [148, 378] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_094258__567.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5129 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_094314__765 | 0 | 0.0 | 15.459 | 2 | [148, 488] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_094314__765.json | 66.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5130 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_022346__172 | 0 | 0.0 | 11.0361 | 0 | [148, 342] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231227_022346__172.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5131 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_080814__345 | 1 | 0.0 | 11.6294 | 3 | [148, 360] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231227_080814__345.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5132 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_082816__353 | 0 | 0.0 | 16.0357 | 0 | [229, 414] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231214_082816__353.json | 0.0 | missing | missing | missing | |
| 5133 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_094233__871 | 0 | 0.0 | 19.8831 | 0 | [249, 430] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_094233__871.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5134 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_094246__992 | 0 | 0.0 | 12.7786 | 0 | [249, 382] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_094246__992.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5135 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_022334__913 | 0 | 0.0 | 14.9509 | 3 | [249, 284] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231227_022334__913.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5136 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_080802__111 | 0 | 0.0 | 20.7585 | 0 | [249, 468] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231227_080802__111.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5137 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_083012__871 | 0 | 0.0 | 31.1516 | 0 | [11, 806] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231214_083012__871.json | 25.0 | missing | missing | missing | |
| 5138 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_094442__644 | 0 | 0.0 | 9.60781 | 0 | [415, 251] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_094442__644.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5139 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_094458__750 | 0 | 0.0 | 15.6798 | 0 | [415, 441] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_094458__750.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5140 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_022422__314 | 0 | 0.0 | 11.3691 | 1 | [415, 304] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231227_022422__314.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5141 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_080902__149 | 0 | 0.0 | 20.7269 | 1 | [415, 589] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231227_080902__149.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5142 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_082941__707 | 0 | 0.0 | 33.6933 | 0 | [399, 784] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231214_082941__707.json | 50.0 | missing | missing | missing | |
| 5143 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_094416__371 | 2 | 0.0 | 10.7378 | 3 | [413, 286] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_094416__371.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5144 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_094433__315 | 0 | 0.0 | 16.9908 | 3 | [413, 481] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_094433__315.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5145 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_022410__203 | 2 | 0.0 | 14.1299 | 3 | [413, 390] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231227_022410__203.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5146 | Apple-MacBook-Pro-M1 | q_and_a_extractor | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_080841__559 | 0 | 0.0 | 13.5146 | 0 | [413, 370] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231227_080841__559.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5147 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231214_083049__308 | 0 | 0.0 | 10.0355 | 0 | [99, 290] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__InJulia__1SHOT__20231214_083049__308.json | 25.0 | missing | missing | missing | |
| 5148 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231225_094908__194 | 0 | 0.0 | 80.5029 | 0 | [104, 596] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_094908__194.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5149 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231225_095001__774 | 0 | 0.0 | 52.8473 | 0 | [104, 390] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_095001__774.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5150 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231227_022747__432 | 0 | 0.0 | 77.8554 | 0 | [104, 580] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__InJulia__1SHOT__20231227_022747__432.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5151 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231227_081214__197 | 0 | 0.0 | 77.0827 | 0 | [104, 574] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__InJulia__1SHOT__20231227_081214__197.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5152 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_083039__373 | 0 | 0.0 | 10.931 | 1 | [128, 309] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231214_083039__373.json | 58.3333 | missing | missing | missing | |
| 5153 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_094701__305 | 0 | 0.0 | 62.91 | 0 | [143, 451] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_094701__305.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5154 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_094747__465 | 0 | 0.0 | 45.6226 | 0 | [143, 329] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_094747__465.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5155 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_022629__983 | 0 | 0.0 | 63.9752 | 0 | [143, 470] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231227_022629__983.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5156 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_081057__677 | 0 | 0.0 | 53.4235 | 0 | [143, 390] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231227_081057__677.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5157 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231214_083028__689 | 0 | 0.0 | 15.687 | 0 | [229, 405] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231214_083028__689.json | 0.0 | missing | missing | missing | |
| 5158 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_094543__377 | 0 | 0.0 | 44.4544 | 0 | [244, 121] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_094543__377.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5159 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_094559__244 | 0 | 0.0 | 15.5151 | 0 | [244, 81] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_094559__244.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5160 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_022525__287 | 0 | 0.0 | 63.5399 | 0 | [244, 278] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231227_022525__287.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5161 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_081002__348 | 0 | 0.0 | 60.4046 | 3 | [244, 254] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231227_081002__348.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5162 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_083213__629 | 0 | 0.0 | 20.4187 | 0 | [11, 548] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231214_083213__629.json | 50.0 | missing | missing | missing | |
| 5163 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_095549__902 | 0 | 0.0 | 79.9459 | 0 | [417, 517] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_095549__902.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5164 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_095721__464 | 0 | 0.0 | 92.412 | 0 | [417, 604] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_095721__464.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5165 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_022955__439 | 0 | 0.0 | 64.3869 | 0 | [417, 414] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231227_022955__439.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5166 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_083152__848 | 0 | 0.0 | 40.1486 | 0 | [399, 933] | 0.4.0 | 3 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231214_083152__848.json | 50.0 | missing | missing | missing | |
| 5167 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_095320__947 | 0 | 0.0 | 69.7206 | 3 | [415, 451] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_095320__947.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5168 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_095429__343 | 0 | 0.0 | 68.4094 | 0 | [415, 435] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_095429__343.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5169 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_022851__900 | 0 | 0.0 | 63.7192 | 0 | [415, 413] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231227_022851__900.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5170 | Apple-MacBook-Pro-M1 | q_and_a_extractor | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_081310__448 | 0 | 0.0 | 56.1029 | 0 | [415, 357] | 0.6.0 | 3 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/q_and_a_extractor/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231227_081310__448.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5171 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | InJulia | 1SHOT | true | false | 5 | 20231214_085001__207 | 0 | 0.0 | 9.77953 | 0 | [74, 291] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231214_085001__207.json | 25.0 | missing | missing | missing | |
| 5172 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_085720__167 | 5 | 0.0 | 15.1671 | 5 | [82, 277] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_085720__167.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5173 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231225_085739__613 | 5 | 0.0 | 18.8354 | 5 | [82, 346] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_085739__613.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5174 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | InJulia | 1SHOT | true | true | 5 | 20231227_085640__771 | 5 | 0.0 | 20.1605 | 5 | [82, 368] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231227_085640__771.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5175 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_084951__804 | 1 | 0.0 | 7.61435 | 0 | [103, 215] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231214_084951__804.json | 55.0 | missing | missing | missing | |
| 5176 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_085703__124 | 0 | 0.0 | 4.53612 | 0 | [120, 68] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_085703__124.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5177 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_085705__672 | 0 | 0.0 | 2.65283 | 0 | [120, 31] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_085705__672.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5178 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_085619__746 | 5 | 0.0 | 10.9073 | 5 | [120, 189] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231227_085619__746.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5179 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_084943__314 | 1 | 0.0 | 15.1608 | 0 | [201, 401] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231214_084943__314.json | 55.0 | missing | missing | missing | |
| 5180 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_085648__275 | 5 | 0.0 | 25.9908 | 5 | [219, 269] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_085648__275.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5181 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_085658__813 | 0 | 0.0 | 9.72037 | 0 | [219, 151] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_085658__813.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5182 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_085608__905 | 5 | 0.0 | 19.5126 | 5 | [219, 158] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231227_085608__905.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5183 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_085050__679 | 1 | 0.0 | 11.7442 | 0 | [11, 326] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231214_085050__679.json | 55.0 | missing | missing | missing | |
| 5184 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_085920__566 | 5 | 0.0 | 25.6768 | 5 | [385, 407] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_085920__566.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5185 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_085932__529 | 5 | 0.0 | 11.9565 | 5 | [385, 160] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_085932__529.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5186 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_085730__484 | 1 | 0.0 | 38.0886 | 0 | [385, 616] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231227_085730__484.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5187 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | false | 5 | 20231214_085038__605 | 0 | 0.0 | 24.0292 | 0 | [374, 564] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231214_085038__605.json | 25.0 | missing | missing | missing | |
| 5188 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_085836__934 | 5 | 0.0 | 22.2061 | 5 | [382, 350] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_085836__934.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5189 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_085855__137 | 1 | 0.0 | 18.1823 | 0 | [382, 278] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_085855__137.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5190 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_085652__984 | 5 | 0.0 | 12.4699 | 5 | [382, 172] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231227_085652__984.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5191 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | InJulia | 1SHOT | true | false | 5 | 20231214_085126__557 | 0 | 0.0 | 13.6638 | 0 | [74, 397] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__InJulia__1SHOT__20231214_085126__557.json | 25.0 | missing | missing | missing | |
| 5192 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | InJulia | 1SHOT | true | false | 5 | 20231225_090019__184 | 0 | 0.0 | 19.9242 | 0 | [56, 372] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_090019__184.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5193 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | InJulia | 1SHOT | false | false | 5 | 20231225_090022__817 | 0 | 0.0 | 3.08442 | 0 | [56, 50] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_090022__817.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5194 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_085111__987 | 1 | 0.0 | 6.76224 | 0 | [103, 190] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231214_085111__987.json | 55.0 | missing | missing | missing | |
| 5195 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_085954__129 | 0 | 0.0 | 6.72709 | 0 | [57, 121] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_085954__129.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5196 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_085959__406 | 0 | 0.0 | 4.73738 | 0 | [57, 82] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_085959__406.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5197 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_085104__138 | 1 | 0.0 | 14.4655 | 0 | [201, 382] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231214_085104__138.json | 55.0 | missing | missing | missing | |
| 5198 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_085945__318 | 0 | 0.0 | 12.3463 | 0 | [94, 30] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_085945__318.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5199 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_085947__767 | 0 | 0.0 | 2.31938 | 0 | [94, 29] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_085947__767.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5200 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_085232__431 | 1 | 0.0 | 25.9812 | 0 | [11, 686] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231214_085232__431.json | 55.0 | missing | missing | missing | |
| 5201 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_090047__474 | 0 | 0.0 | 1.21617 | 0 | [74, 8] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_090047__474.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5202 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_085206__850 | 1 | 0.0 | 27.4067 | 0 | [374, 641] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231214_085206__850.json | 55.0 | missing | missing | missing | |
| 5203 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_090038__599 | 0 | 0.0 | 3.46579 | 0 | [71, 52] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_090038__599.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5204 | Apple-MacBook-Pro-M1 | timezone_bumper | codellama:13b-python | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_090046__218 | 0 | 0.0 | 7.28874 | 0 | [71, 126] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_090046__218.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5205 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_091943__508 | 5 | 0.0 | 39.4811 | 5 | [73, 237] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_091943__508.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5206 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_092027__565 | 5 | 0.0 | 43.6732 | 5 | [73, 263] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_092027__565.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5207 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_090521__838 | 5 | 0.0 | 26.8841 | 5 | [73, 157] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_090521__838.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5208 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_091833__850 | 5 | 0.0 | 38.651 | 5 | [114, 226] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_091833__850.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5209 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_091903__120 | 5 | 0.0 | 30.0392 | 5 | [114, 172] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_091903__120.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5210 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_090454__289 | 5 | 0.0 | 31.2548 | 5 | [114, 179] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_090454__289.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5211 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_091618__597 | 5 | 0.0 | 65.7138 | 5 | [212, 210] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_091618__597.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5212 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_091754__227 | 5 | 0.0 | 95.9367 | 5 | [212, 557] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_091754__227.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5213 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_090422__838 | 5 | 0.0 | 88.6963 | 5 | [212, 367] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_090422__838.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5214 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_092419__262 | 5 | 0.0 | 44.4627 | 5 | [402, 213] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_092419__262.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5215 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_092459__266 | 5 | 0.0 | 40.2267 | 5 | [402, 187] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_092459__266.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5216 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_090707__474 | 5 | 0.0 | 73.0678 | 5 | [402, 384] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_090707__474.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5217 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_092248__481 | 5 | 0.0 | 51.953 | 5 | [400, 258] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_092248__481.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5218 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_092334__101 | 5 | 0.0 | 46.1205 | 5 | [400, 222] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_092334__101.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5219 | Apple-MacBook-Pro-M1 | timezone_bumper | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_090554__211 | 5 | 0.0 | 33.2663 | 5 | [400, 144] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_090554__211.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5220 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_091356__770 | 0 | 0.0 | 9.76489 | 0 | [77, 375] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_091356__770.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5221 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_131925__426 | 0 | 0.0 | 10.0801 | 0 | [77, 388] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_131925__426.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5222 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 5 | 20231227_131934__590 | 0 | 0.0 | 8.74397 | 0 | [77, 337] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_131934__590.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5223 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_131942__173 | 0 | 0.0 | 7.39312 | 0 | [77, 285] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_131942__173.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5224 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_091346__723 | 1 | 0.0 | 7.17703 | 0 | [114, 271] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_091346__723.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5225 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_131858__380 | 0 | 0.0 | 2.94165 | 0 | [114, 105] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_131858__380.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5226 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_131907__489 | 1 | 0.0 | 9.20755 | 0 | [114, 349] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_131907__489.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5227 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_131915__847 | 0 | 0.0 | 8.0069 | 0 | [114, 303] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_131915__847.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5228 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_091339__426 | 0 | 0.0 | 8.71512 | 0 | [203, 185] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_091339__426.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5229 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_131841__119 | 0 | 0.0 | 11.9022 | 0 | [203, 299] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_131841__119.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5230 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_131846__508 | 0 | 0.0 | 4.3235 | 0 | [203, 146] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_131846__508.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5231 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_131855__545 | 0 | 0.0 | 9.21093 | 0 | [203, 332] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_131855__545.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5232 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_091415__205 | 0 | 0.0 | 8.61934 | 0 | [366, 278] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_091415__205.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5233 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_132030__951 | 0 | 0.0 | 11.2339 | 0 | [366, 373] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_132030__951.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5234 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_132048__147 | 0 | 0.0 | 17.271 | 0 | [366, 584] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_132048__147.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5235 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_132056__622 | 0 | 0.0 | 8.0301 | 0 | [366, 257] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_132056__622.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5236 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_091407__367 | 0 | 0.0 | 10.2397 | 0 | [363, 337] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_091407__367.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5237 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_131950__413 | 1 | 0.0 | 8.73254 | 0 | [363, 283] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_131950__413.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5238 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_132001__272 | 0 | 0.0 | 10.5528 | 0 | [363, 349] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_132001__272.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5239 | Apple-MacBook-Pro-M1 | timezone_bumper | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_132019__455 | 0 | 0.0 | 18.197 | 0 | [363, 616] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_132019__455.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5240 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | InJulia | 1SHOT | true | false | 5 | 20231214_084327__135 | 0 | 0.0 | 12.0294 | 0 | [74, 359] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__InJulia__1SHOT__20231214_084327__135.json | 25.0 | missing | missing | missing | |
| 5241 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | InJulia | 1SHOT | true | false | 5 | 20231225_084032__375 | 0 | 0.0 | 13.8508 | 0 | [74, 410] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__InJulia__1SHOT__20231225_084032__375.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5242 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | InJulia | 1SHOT | true | false | 5 | 20231225_084042__784 | 0 | 0.0 | 10.549 | 0 | [1, 330] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__InJulia__1SHOT__20231225_084042__784.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5243 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | InJulia | 1SHOT | true | true | 5 | 20231227_084652__623 | 1 | 0.0 | 15.3382 | 0 | [74, 461] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__InJulia__1SHOT__20231227_084652__623.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5244 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_084314__641 | 1 | 0.0 | 9.36455 | 0 | [103, 268] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaExpertAsk__1SHOT__20231214_084314__641.json | 55.0 | missing | missing | missing | |
| 5245 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_084009__406 | 1 | 0.0 | 7.60119 | 0 | [103, 215] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_084009__406.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5246 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_084018__266 | 1 | 0.0 | 7.54759 | 0 | [1, 235] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_084018__266.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5247 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_084636__917 | 1 | 0.0 | 8.14064 | 0 | [103, 236] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaExpertAsk__1SHOT__20231227_084636__917.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5248 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_084305__284 | 1 | 0.0 | 22.7925 | 0 | [201, 607] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_084305__284.json | 55.0 | missing | missing | missing | |
| 5249 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_083948__102 | 1 | 0.0 | 30.4059 | 0 | [219, 667] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_083948__102.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5250 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_084001__821 | 1 | 0.0 | 12.9921 | 0 | [1, 381] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_084001__821.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5251 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_084628__220 | 1 | 0.0 | 25.3701 | 0 | [219, 554] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_084628__220.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5252 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_084419__377 | 0 | 0.0 | 24.3539 | 0 | [11, 650] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_084419__377.json | 25.0 | missing | missing | missing | |
| 5253 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_084207__439 | 1 | 0.0 | 23.377 | 0 | [11, 628] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_084207__439.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5254 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_084229__388 | 1 | 0.0 | 21.6094 | 0 | [1, 588] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_084229__388.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5255 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_084733__353 | 1 | 0.0 | 21.6528 | 0 | [11, 590] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_084733__353.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5256 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaRecapTask | 1SHOT | false | false | 5 | 20231214_084355__609 | 0 | 0.0 | 17.6727 | 0 | [374, 400] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaRecapTask__1SHOT__20231214_084355__609.json | 0.0 | missing | missing | missing | |
| 5257 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_084129__253 | 1 | 0.0 | 24.017 | 0 | [374, 564] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_084129__253.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5258 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_084144__999 | 1 | 0.0 | 15.0735 | 0 | [1, 420] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_084144__999.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5259 | Apple-MacBook-Pro-M1 | timezone_bumper | llama2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_084711__425 | 1 | 0.0 | 19.3638 | 0 | [374, 451] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/llama2/evaluation__JuliaRecapTask__1SHOT__20231227_084711__425.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5260 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | InJulia | 1SHOT | true | false | 5 | 20231214_085312__595 | 0 | 0.0 | 10.4409 | 0 | [74, 312] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__InJulia__1SHOT__20231214_085312__595.json | 25.0 | missing | missing | missing | |
| 5261 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_090137__868 | 5 | 0.0 | 8.57739 | 5 | [74, 283] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__InJulia__1SHOT__20231225_090137__868.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5262 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | InJulia | 1SHOT | true | true | 5 | 20231225_090142__349 | 2 | 0.0 | 5.03804 | 1 | [74, 161] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__InJulia__1SHOT__20231225_090142__349.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5263 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | InJulia | 1SHOT | true | false | 5 | 20231227_085758__238 | 0 | 0.0 | 6.25118 | 0 | [74, 201] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__InJulia__1SHOT__20231227_085758__238.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5264 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231214_085301__310 | 0 | 0.0 | 11.6111 | 0 | [103, 334] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231214_085301__310.json | 25.0 | missing | missing | missing | |
| 5265 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_090121__915 | 2 | 0.0 | 8.94039 | 1 | [113, 289] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_090121__915.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5266 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_090128__571 | 5 | 0.0 | 7.71555 | 5 | [113, 247] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_090128__571.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5267 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_085751__326 | 5 | 0.0 | 6.65742 | 5 | [113, 209] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231227_085751__326.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5268 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231214_085250__755 | 0 | 0.0 | 17.5646 | 0 | [201, 467] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231214_085250__755.json | 25.0 | missing | missing | missing | |
| 5269 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_090104__646 | 5 | 0.0 | 16.2405 | 5 | [211, 314] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_090104__646.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5270 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_090112__359 | 5 | 0.0 | 7.44667 | 5 | [211, 221] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_090112__359.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5271 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_085745__416 | 5 | 0.0 | 14.1651 | 5 | [211, 250] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231227_085745__416.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5272 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_085406__806 | 0 | 0.0 | 21.0846 | 0 | [11, 562] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231214_085406__806.json | 0.0 | missing | missing | missing | |
| 5273 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_090229__805 | 2 | 0.0 | 12.479 | 1 | [377, 355] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_090229__805.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5274 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_090237__643 | 5 | 0.0 | 8.09549 | 5 | [377, 214] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_090237__643.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5275 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_085819__863 | 2 | 0.0 | 10.1473 | 1 | [377, 278] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231227_085819__863.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5276 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_085345__247 | 1 | 0.0 | 18.3237 | 0 | [374, 417] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaRecapTask__1SHOT__20231214_085345__247.json | 55.0 | missing | missing | missing | |
| 5277 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_090203__532 | 2 | 0.0 | 9.94566 | 1 | [374, 274] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_090203__532.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5278 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_090216__186 | 0 | 0.0 | 13.1322 | 0 | [374, 376] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_090216__186.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5279 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_085808__144 | 2 | 0.0 | 10.7377 | 1 | [374, 297] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder/evaluation__JuliaRecapTask__1SHOT__20231227_085808__144.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5280 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_183057__907 | 2 | 0.0 | 14.9717 | 1 | [74, 289] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_183057__907.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5281 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_183115__631 | 2 | 0.0 | 17.7507 | 1 | [74, 344] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_183115__631.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5282 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_183122__490 | 2 | 0.0 | 7.11959 | 1 | [74, 133] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_183122__490.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5283 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_183009__206 | 2 | 0.0 | 10.5485 | 1 | [113, 185] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_183009__206.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5284 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_183025__114 | 2 | 0.0 | 15.2528 | 1 | [113, 290] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_183025__114.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5285 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231227_183042__884 | 0 | 0.0 | 16.988 | 0 | [113, 324] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_183042__884.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5286 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_182930__103 | 5 | 0.0 | 16.6186 | 5 | [211, 304] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_182930__103.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5287 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_182946__126 | 2 | 0.0 | 15.6496 | 1 | [211, 284] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_182946__126.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5288 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_182959__157 | 2 | 0.0 | 13.1617 | 1 | [211, 230] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_182959__157.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5289 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_183220__585 | 2 | 0.0 | 10.0225 | 1 | [377, 154] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_183220__585.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5290 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_183238__407 | 2 | 0.0 | 17.5141 | 1 | [377, 301] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_183238__407.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5291 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_183256__929 | 0 | 0.0 | 18.3406 | 0 | [377, 315] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_183256__929.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5292 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_183138__754 | 0 | 0.0 | 16.6935 | 0 | [374, 285] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_183138__754.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5293 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_183155__530 | 5 | 0.0 | 16.4999 | 5 | [374, 280] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_183155__530.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5294 | Apple-MacBook-Pro-M1 | timezone_bumper | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_183210__513 | 2 | 0.0 | 14.6743 | 1 | [374, 246] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_183210__513.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5295 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_092841__823 | 1 | 0.0 | 4.74709 | 0 | [70, 113] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_092841__823.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5296 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_092849__864 | 1 | 0.0 | 7.48931 | 0 | [70, 185] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_092849__864.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5297 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_090850__294 | 0 | 0.0 | 11.4487 | 0 | [70, 286] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_090850__294.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5298 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_092829__485 | 1 | 0.0 | 3.05739 | 0 | [111, 63] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_092829__485.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5299 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_092836__366 | 1 | 0.0 | 7.6875 | 0 | [111, 185] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_092836__366.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5300 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_090839__595 | 0 | 0.0 | 3.33001 | 0 | [111, 70] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_090839__595.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5301 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_092809__493 | 1 | 0.0 | 22.5607 | 0 | [209, 408] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_092809__493.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5302 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_092826__333 | 1 | 0.0 | 15.9242 | 0 | [209, 381] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_092826__333.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5303 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_090835__141 | 1 | 0.0 | 12.9802 | 0 | [209, 168] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_090835__141.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5304 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_093015__369 | 1 | 0.0 | 17.6342 | 0 | [378, 395] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_093015__369.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5305 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_093023__245 | 1 | 0.0 | 7.71722 | 0 | [378, 147] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_093023__245.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5306 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_090927__316 | 1 | 0.0 | 26.437 | 0 | [378, 605] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_090927__316.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5307 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_092932__473 | 1 | 0.0 | 16.3475 | 0 | [376, 364] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_092932__473.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5308 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_092957__406 | 1 | 0.0 | 25.0085 | 0 | [376, 576] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_092957__406.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5309 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_090900__706 | 0 | 0.0 | 9.97851 | 0 | [376, 203] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_090900__706.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5310 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231228_004250__501 | 1 | 0.0 | 10.8377 | 0 | [69, 343] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_004250__501.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5311 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231228_004256__385 | 1 | 0.0 | 6.24295 | 0 | [69, 193] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_004256__385.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5312 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231228_004304__124 | 5 | 0.0 | 7.63896 | 5 | [69, 238] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_004304__124.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5313 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231228_004311__900 | 5 | 0.0 | 6.99294 | 5 | [69, 218] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_004311__900.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5314 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 5 | 20231228_004324__459 | 1 | 0.0 | 12.9356 | 0 | [69, 410] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_004324__459.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5315 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_004228__545 | 1 | 0.0 | 2.96869 | 0 | [110, 78] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_004228__545.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5316 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_004230__386 | 1 | 0.0 | 2.64896 | 0 | [110, 69] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_004230__386.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5317 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_004233__884 | 1 | 0.0 | 2.75102 | 0 | [110, 72] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_004233__884.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5318 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_004236__671 | 5 | 0.0 | 2.80437 | 5 | [110, 74] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_004236__671.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5319 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_004239__628 | 5 | 0.0 | 3.09971 | 5 | [110, 84] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_004239__628.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5320 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231228_004143__499 | 0 | 0.0 | 22.8254 | 0 | [208, 663] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_004143__499.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5321 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_004152__522 | 1 | 0.0 | 9.28233 | 0 | [208, 269] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_004152__522.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5322 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_004201__702 | 5 | 0.0 | 8.65301 | 5 | [208, 249] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_004201__702.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5323 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_004212__292 | 1 | 0.0 | 11.664 | 0 | [208, 345] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_004212__292.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5324 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_004224__345 | 1 | 0.0 | 11.9189 | 0 | [208, 353] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_004224__345.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5325 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_004440__160 | 5 | 0.0 | 8.98066 | 5 | [377, 231] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_004440__160.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5326 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_004451__781 | 1 | 0.0 | 11.2965 | 0 | [377, 303] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_004451__781.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5327 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231228_004503__294 | 0 | 0.0 | 11.4483 | 0 | [377, 308] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_004503__294.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5328 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_004519__135 | 1 | 0.0 | 16.3821 | 0 | [377, 459] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_004519__135.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5329 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_004541__562 | 1 | 0.0 | 21.938 | 5 | [377, 625] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_004541__562.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5330 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_004339__494 | 1 | 0.0 | 15.0596 | 0 | [375, 419] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_004339__494.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5331 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_004351__289 | 1 | 0.0 | 11.3438 | 0 | [375, 305] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_004351__289.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5332 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_004408__815 | 1 | 0.0 | 16.5448 | 0 | [375, 464] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_004408__815.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5333 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_004421__661 | 1 | 0.0 | 13.0064 | 0 | [375, 357] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_004421__661.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5334 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_004431__385 | 1 | 0.0 | 10.0879 | 0 | [375, 266] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_004431__385.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5335 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_004736__317 | 1 | 0.0 | 8.15433 | 0 | [69, 200] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_004736__317.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5336 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_004747__137 | 1 | 0.0 | 10.2072 | 0 | [69, 253] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_004747__137.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5337 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_004756__352 | 1 | 0.0 | 9.43706 | 0 | [69, 233] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_004756__352.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5338 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_004805__996 | 1 | 0.0 | 8.28405 | 0 | [69, 203] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_004805__996.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5339 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231228_004812__366 | 1 | 0.0 | 7.45734 | 0 | [69, 182] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_004812__366.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5340 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_004710__187 | 1 | 0.0 | 6.89194 | 0 | [110, 162] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_004710__187.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5341 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_004713__970 | 1 | 0.0 | 3.38911 | 0 | [110, 71] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_004713__970.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5342 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_004717__876 | 1 | 0.0 | 3.77885 | 0 | [110, 81] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_004717__876.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5343 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_004724__385 | 1 | 0.0 | 6.84671 | 0 | [110, 161] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_004724__385.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5344 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231228_004728__853 | 1 | 0.0 | 3.89385 | 0 | [110, 84] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_004728__853.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5345 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_004548__662 | 1 | 0.0 | 7.20853 | 0 | [208, 135] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_004548__662.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5346 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_004606__966 | 1 | 0.0 | 17.5558 | 0 | [208, 417] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_004606__966.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5347 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_004622__133 | 1 | 0.0 | 15.6303 | 0 | [208, 369] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_004622__133.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5348 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231228_004645__970 | 1 | 0.0 | 23.1409 | 0 | [208, 555] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_004645__970.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5349 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231228_004703__183 | 0 | 0.0 | 17.8331 | 0 | [208, 424] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_004703__183.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5350 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_005002__992 | 1 | 0.0 | 18.105 | 0 | [377, 402] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_005002__992.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5351 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_005014__877 | 1 | 0.0 | 12.2009 | 0 | [377, 257] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_005014__877.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5352 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_005026__187 | 1 | 0.0 | 11.7697 | 0 | [377, 246] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_005026__187.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5353 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_005037__250 | 1 | 0.0 | 10.9194 | 0 | [377, 225] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_005037__250.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5354 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231228_005051__288 | 5 | 0.0 | 13.4944 | 5 | [377, 288] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_005051__288.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5355 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_004825__605 | 1 | 0.0 | 12.3583 | 0 | [375, 261] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_004825__605.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5356 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231228_004842__957 | 0 | 0.0 | 17.487 | 0 | [375, 387] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_004842__957.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5357 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_004901__240 | 1 | 0.0 | 18.4639 | 0 | [375, 411] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_004901__240.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5358 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_004925__926 | 1 | 0.0 | 24.0378 | 5 | [375, 546] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_004925__926.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5359 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231228_004944__463 | 1 | 0.0 | 19.0711 | 0 | [375, 426] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_004944__463.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5360 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_124148__513 | 1 | 0.0 | 9.98767 | 0 | [69, 180] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_124148__513.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5361 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231226_124159__808 | 1 | 0.0 | 11.0499 | 0 | [69, 197] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_124159__808.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5362 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 5 | 20231227_091234__230 | 1 | 0.0 | 11.6593 | 0 | [69, 211] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231227_091234__230.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5363 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_124133__449 | 1 | 0.0 | 4.07262 | 0 | [110, 64] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_124133__449.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5364 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231226_124138__511 | 5 | 0.0 | 5.01771 | 5 | [110, 82] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_124138__511.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5365 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_091222__925 | 1 | 0.0 | 4.25023 | 0 | [110, 67] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_091222__925.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5366 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231226_124121__136 | 1 | 0.0 | 25.329 | 5 | [208, 448] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_124121__136.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5367 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231226_124129__750 | 0 | 0.0 | 7.56331 | 0 | [208, 120] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_124129__750.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5368 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_091218__856 | 1 | 0.0 | 31.0774 | 0 | [208, 394] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_091218__856.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5369 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231226_124344__597 | 1 | 0.0 | 19.0455 | 0 | [377, 310] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_124344__597.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5370 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231226_124431__806 | 0 | 0.0 | 47.0613 | 0 | [377, 735] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_124431__806.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5371 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_091330__334 | 5 | 0.0 | 31.1692 | 5 | [377, 531] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_091330__334.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5372 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_124249__480 | 1 | 0.0 | 11.1745 | 0 | [375, 167] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_124249__480.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5373 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231226_124324__837 | 1 | 0.0 | 35.4075 | 5 | [375, 598] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_124324__837.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5374 | Apple-MacBook-Pro-M1 | timezone_bumper | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_091259__677 | 1 | 0.0 | 25.0854 | 0 | [375, 421] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_091259__677.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5375 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_091715__495 | 1 | 0.0 | 56.3707 | 0 | [75, 333] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_091715__495.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5376 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | false | 5 | 20231227_132525__325 | 0 | 0.0 | 50.038 | 0 | [75, 295] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_132525__325.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5377 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_132559__983 | 5 | 0.0 | 33.7019 | 5 | [75, 195] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_132559__983.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5378 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_132637__766 | 1 | 0.0 | 37.766 | 0 | [75, 220] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_132637__766.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5379 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_091619__233 | 1 | 0.0 | 53.8678 | 0 | [114, 312] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_091619__233.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5380 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_132350__315 | 1 | 0.0 | 46.2178 | 0 | [114, 266] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_132350__315.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5381 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_132406__976 | 1 | 0.0 | 16.2475 | 0 | [114, 82] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_132406__976.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5382 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_132435__218 | 1 | 0.0 | 28.8871 | 0 | [114, 160] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_132435__218.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5383 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_091525__841 | 5 | 0.0 | 69.7117 | 5 | [210, 251] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_091525__841.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5384 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_132208__322 | 1 | 0.0 | 72.815 | 0 | [210, 380] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_132208__322.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5385 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_132231__693 | 0 | 0.0 | 22.2539 | 0 | [210, 104] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_132231__693.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5386 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_132304__857 | 0 | 0.0 | 32.5602 | 0 | [210, 167] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_132304__857.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5387 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_091914__163 | 1 | 0.0 | 70.3283 | 1 | [388, 358] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_091914__163.json | 60.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5388 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_133009__269 | 0 | 0.0 | 11.037 | 0 | [388, 5] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_133009__269.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5389 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_133111__381 | 1 | 0.0 | 61.7347 | 0 | [388, 308] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_133111__381.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5390 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_133225__108 | 1 | 0.0 | 74.7223 | 0 | [388, 384] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_133225__108.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5391 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_091803__510 | 1 | 0.0 | 47.9829 | 5 | [386, 226] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_091803__510.json | 80.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5392 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_132738__132 | 5 | 0.0 | 61.3755 | 5 | [386, 306] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_132738__132.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5393 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_132844__675 | 0 | 0.0 | 65.9628 | 0 | [386, 333] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_132844__675.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5394 | Apple-MacBook-Pro-M1 | timezone_bumper | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_132958__310 | 1 | 0.0 | 73.2895 | 0 | [386, 376] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_132958__310.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5395 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_093115__387 | 4 | 0.0 | 8.51272 | 4 | [78, 211] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_093115__387.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5396 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_093121__995 | 5 | 0.0 | 6.33191 | 5 | [78, 154] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_093121__995.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5397 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_090954__142 | 5 | 0.0 | 5.11181 | 5 | [78, 121] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231227_090954__142.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5398 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_093100__665 | 5 | 0.0 | 6.52583 | 5 | [119, 154] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_093100__665.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5399 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_093106__449 | 1 | 0.0 | 6.33345 | 0 | [119, 149] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_093106__449.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5400 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_090949__828 | 5 | 0.0 | 3.19025 | 5 | [119, 66] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_090949__828.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5401 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_093042__268 | 5 | 0.0 | 18.752 | 5 | [217, 296] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_093042__268.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5402 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_093053__827 | 5 | 0.0 | 11.5098 | 5 | [217, 268] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_093053__827.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5403 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_090946__638 | 4 | 0.0 | 18.9196 | 4 | [217, 303] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_090946__638.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5404 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_093225__856 | 5 | 0.0 | 10.3264 | 5 | [386, 209] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_093225__856.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5405 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_093230__435 | 1 | 0.0 | 5.18387 | 0 | [386, 78] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_093230__435.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5406 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_091023__931 | 1 | 0.0 | 13.8186 | 0 | [386, 294] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_091023__931.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5407 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_093207__552 | 4 | 0.0 | 19.3037 | 4 | [384, 435] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_093207__552.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5408 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_093214__746 | 5 | 0.0 | 7.1739 | 5 | [384, 133] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_093214__746.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5409 | Apple-MacBook-Pro-M1 | timezone_bumper | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_091009__670 | 1 | 0.0 | 14.7093 | 0 | [384, 320] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_091009__670.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5410 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231214_084452__103 | 1 | 0.0 | 11.7372 | 0 | [74, 349] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231214_084452__103.json | 55.0 | missing | missing | missing | |
| 5411 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_084329__968 | 1 | 0.0 | 12.0158 | 0 | [76, 391] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_084329__968.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5412 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231225_084334__871 | 1 | 0.0 | 4.56678 | 0 | [76, 141] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_084334__871.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5413 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 5 | 20231227_084805__904 | 5 | 0.0 | 5.7173 | 5 | [76, 179] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231227_084805__904.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5414 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_084440__347 | 1 | 0.0 | 8.58027 | 0 | [103, 245] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231214_084440__347.json | 55.0 | missing | missing | missing | |
| 5415 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_084311__311 | 1 | 0.0 | 4.38059 | 0 | [117, 130] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_084311__311.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5416 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_084317__206 | 2 | 0.0 | 5.40921 | 5 | [117, 165] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_084317__206.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5417 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_084800__741 | 1 | 0.0 | 4.27726 | 0 | [117, 125] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231227_084800__741.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5418 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_084431__778 | 1 | 0.0 | 12.1676 | 0 | [201, 317] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231214_084431__778.json | 55.0 | missing | missing | missing | |
| 5419 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_084307__798 | 0 | 0.0 | 13.8897 | 0 | [215, 426] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_084307__798.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5420 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_084755__613 | 5 | 0.0 | 22.0678 | 5 | [215, 523] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231227_084755__613.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5421 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_084542__317 | 0 | 0.0 | 11.4392 | 0 | [11, 318] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231214_084542__317.json | 0.0 | missing | missing | missing | |
| 5422 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_084431__936 | 1 | 0.0 | 14.3154 | 0 | [384, 407] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_084431__936.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5423 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231225_084450__105 | 0 | 0.0 | 19.1579 | 0 | [384, 557] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_084450__105.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5424 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_084832__241 | 5 | 0.0 | 14.8027 | 5 | [384, 418] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231227_084832__241.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5425 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_084531__491 | 1 | 0.0 | 22.7711 | 0 | [374, 532] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231214_084531__491.json | 55.0 | missing | missing | missing | |
| 5426 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_084409__381 | 1 | 0.0 | 10.0754 | 0 | [382, 273] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_084409__381.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5427 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_084416__990 | 1 | 0.0 | 6.74894 | 0 | [382, 166] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_084416__990.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5428 | Apple-MacBook-Pro-M1 | timezone_bumper | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_084817__988 | 0 | 0.0 | 11.7032 | 0 | [382, 321] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231227_084817__988.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5429 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | InJulia | 1SHOT | true | false | 5 | 20231214_085646__465 | 0 | 0.0 | 10.6528 | 0 | [74, 318] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__InJulia__1SHOT__20231214_085646__465.json | 25.0 | missing | missing | missing | |
| 5430 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231225_090453__406 | 5 | 0.0 | 14.3116 | 5 | [77, 260] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__InJulia__1SHOT__20231225_090453__406.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5431 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231225_090503__152 | 5 | 0.0 | 9.37374 | 5 | [77, 166] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__InJulia__1SHOT__20231225_090503__152.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5432 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | InJulia | 1SHOT | true | true | 5 | 20231227_085925__624 | 5 | 0.0 | 9.8162 | 5 | [77, 173] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__InJulia__1SHOT__20231227_085925__624.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5433 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_085635__500 | 1 | 0.0 | 8.31986 | 0 | [103, 236] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231214_085635__500.json | 55.0 | missing | missing | missing | |
| 5434 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_090436__456 | 0 | 0.0 | 2.92999 | 0 | [116, 37] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_090436__456.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5435 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_090439__313 | 0 | 0.0 | 3.32145 | 0 | [116, 44] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_090439__313.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5436 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_085915__242 | 2 | 0.0 | 14.4734 | 1 | [116, 255] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231227_085915__242.json | 65.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5437 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_085626__932 | 1 | 0.0 | 20.8274 | 0 | [201, 555] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231214_085626__932.json | 55.0 | missing | missing | missing | |
| 5438 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_090408__658 | 0 | 0.0 | 16.7117 | 0 | [214, 106] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_090408__658.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5439 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_090433__386 | 4 | 0.0 | 25.1049 | 4 | [214, 434] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_090433__386.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5440 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_085900__133 | 0 | 0.0 | 13.6847 | 0 | [214, 52] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231227_085900__133.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5441 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_085733__701 | 1 | 0.0 | 16.0575 | 0 | [11, 440] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231214_085733__701.json | 55.0 | missing | missing | missing | |
| 5442 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_090648__787 | 1 | 0.0 | 26.2603 | 0 | [380, 420] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_090648__787.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5443 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_090717__115 | 4 | 0.0 | 28.8697 | 4 | [380, 465] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_090717__115.json | 90.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5444 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_085953__728 | 1 | 0.0 | 16.1163 | 0 | [380, 238] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231227_085953__728.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5445 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_085717__982 | 1 | 0.0 | 21.294 | 0 | [374, 495] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231214_085717__982.json | 55.0 | missing | missing | missing | |
| 5446 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_090544__919 | 0 | 0.0 | 20.8684 | 0 | [377, 325] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_090544__919.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5447 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaRecapTask | 1SHOT | true | false | 5 | 20231225_090621__804 | 0 | 0.0 | 37.5861 | 0 | [377, 615] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_090621__804.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5448 | Apple-MacBook-Pro-M1 | timezone_bumper | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 5 | 20231227_085937__714 | 0 | 0.0 | 12.0577 | 0 | [377, 165] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231227_085937__714.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5449 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | true | false | 5 | 20231225_093427__311 | 0 | 0.0 | 18.2586 | 0 | [71, 695] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_093427__311.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5450 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231225_093441__448 | 0 | 0.0 | 13.5181 | 0 | [71, 521] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_093441__448.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5451 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 5 | 20231227_091115__928 | 0 | 0.0 | 18.5206 | 0 | [71, 697] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231227_091115__928.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5452 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_093334__712 | 0 | 0.0 | 28.6052 | 0 | [108, 1041] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_093334__712.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5453 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | true | false | 5 | 20231225_093409__210 | 0 | 0.0 | 34.5877 | 0 | [108, 1235] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_093409__210.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5454 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231227_091057__515 | 0 | 0.0 | 23.8123 | 0 | [108, 872] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_091057__515.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5455 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_093236__943 | 0 | 0.0 | 5.66503 | 0 | [197, 54] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_093236__943.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5456 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_093305__137 | 0 | 0.0 | 29.5708 | 0 | [197, 1045] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_093305__137.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5457 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_091033__259 | 0 | 0.0 | 10.0513 | 0 | [197, 233] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_091033__259.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5458 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231225_093610__604 | 0 | 0.0 | 4.21513 | 0 | [360, 117] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_093610__604.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5459 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_093618__259 | 1 | 0.0 | 7.87215 | 0 | [360, 254] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_093618__259.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5460 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231227_091147__139 | 0 | 0.0 | 23.3238 | 0 | [360, 787] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_091147__139.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5461 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_093602__978 | 1 | 0.0 | 8.89339 | 0 | [357, 291] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_093602__978.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5462 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_093605__408 | 0 | 0.0 | 3.37804 | 0 | [357, 84] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_093605__408.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5463 | Apple-MacBook-Pro-M1 | timezone_bumper | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_091123__461 | 1 | 0.0 | 8.09956 | 0 | [357, 260] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_091123__461.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5464 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | InJulia | 1SHOT | true | false | 5 | 20231214_085819__418 | 0 | 0.0 | 12.5925 | 0 | [74, 374] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231214_085819__418.json | 25.0 | missing | missing | missing | |
| 5465 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_091025__616 | 5 | 0.0 | 40.2628 | 5 | [85, 314] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_091025__616.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5466 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231225_091047__232 | 5 | 0.0 | 21.6464 | 5 | [85, 162] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_091047__232.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5467 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 5 | 20231227_090137__387 | 5 | 0.0 | 20.2226 | 5 | [85, 150] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231227_090137__387.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5468 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_085807__811 | 1 | 0.0 | 10.5712 | 0 | [103, 304] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231214_085807__811.json | 55.0 | missing | missing | missing | |
| 5469 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_090919__134 | 1 | 0.0 | 23.8352 | 0 | [124, 175] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_090919__134.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5470 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_090945__494 | 5 | 0.0 | 25.8532 | 5 | [124, 191] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_090945__494.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5471 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_090117__204 | 5 | 0.0 | 33.1074 | 5 | [124, 248] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231227_090117__204.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5472 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_085756__401 | 1 | 0.0 | 22.4553 | 0 | [201, 597] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_085756__401.json | 55.0 | missing | missing | missing | |
| 5473 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_090809__435 | 5 | 0.0 | 51.8628 | 5 | [222, 206] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_090809__435.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5474 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_090855__931 | 5 | 0.0 | 45.9755 | 5 | [222, 335] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_090855__931.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5475 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_090044__977 | 5 | 0.0 | 50.9888 | 5 | [222, 208] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_090044__977.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5476 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_085923__100 | 1 | 0.0 | 15.7539 | 0 | [11, 432] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_085923__100.json | 55.0 | missing | missing | missing | |
| 5477 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_091418__734 | 5 | 0.0 | 27.38 | 5 | [388, 153] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_091418__734.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5478 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_091512__293 | 3 | 0.0 | 53.763 | 4 | [388, 360] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_091512__293.json | 85.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5479 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_090254__424 | 5 | 0.0 | 33.8749 | 5 | [388, 203] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_090254__424.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5480 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_085908__237 | 1 | 0.0 | 30.6426 | 0 | [374, 726] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231214_085908__237.json | 55.0 | missing | missing | missing | |
| 5481 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_091314__809 | 5 | 0.0 | 43.2596 | 5 | [385, 278] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_091314__809.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5482 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_091351__701 | 5 | 0.0 | 37.2954 | 5 | [385, 231] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_091351__701.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5483 | Apple-MacBook-Pro-M1 | timezone_bumper | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_090220__890 | 0 | 0.0 | 42.394 | 0 | [385, 270] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231227_090220__890.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5484 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_092609__464 | 1 | 0.0 | 16.0192 | 0 | [78, 271] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_092609__464.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5485 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231225_092614__625 | 5 | 0.0 | 5.10948 | 5 | [78, 78] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_092614__625.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5486 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 5 | 20231227_090745__881 | 1 | 0.0 | 5.36729 | 0 | [78, 82] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231227_090745__881.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5487 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_092546__569 | 1 | 0.0 | 15.0222 | 0 | [119, 248] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_092546__569.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5488 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_092553__543 | 1 | 0.0 | 7.01194 | 0 | [119, 107] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_092553__543.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5489 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_090740__781 | 5 | 0.0 | 11.8392 | 5 | [119, 191] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_090740__781.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5490 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_092522__952 | 1 | 0.0 | 22.4848 | 0 | [217, 199] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_092522__952.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5491 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231225_092531__467 | 0 | 0.0 | 8.99421 | 0 | [217, 128] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_092531__467.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5492 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 5 | 20231227_090728__253 | 0 | 0.0 | 20.108 | 0 | [217, 171] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_090728__253.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5493 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_092728__570 | 1 | 0.0 | 10.2903 | 0 | [386, 123] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_092728__570.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5494 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_092747__145 | 1 | 0.0 | 19.1297 | 0 | [386, 273] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_092747__145.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5495 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_090822__616 | 5 | 0.0 | 17.372 | 5 | [386, 242] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_090822__616.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5496 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_092654__333 | 1 | 0.0 | 18.0566 | 0 | [384, 259] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_092654__333.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5497 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_092717__707 | 1 | 0.0 | 22.6462 | 0 | [384, 336] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_092717__707.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5498 | Apple-MacBook-Pro-M1 | timezone_bumper | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_090805__233 | 1 | 0.0 | 19.8779 | 0 | [384, 288] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_090805__233.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5499 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | InJulia | 1SHOT | true | false | 5 | 20231214_085448__704 | 0 | 0.0 | 16.9289 | 0 | [74, 498] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__InJulia__1SHOT__20231214_085448__704.json | 25.0 | missing | missing | missing | |
| 5500 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231225_090307__930 | 0 | 0.0 | 5.82786 | 0 | [81, 334] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_090307__930.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5501 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | InJulia | 1SHOT | true | false | 5 | 20231225_090313__673 | 0 | 0.0 | 5.98495 | 0 | [81, 344] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_090313__673.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5502 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | InJulia | 1SHOT | false | false | 5 | 20231227_085834__970 | 0 | 0.0 | 3.67644 | 0 | [81, 207] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__InJulia__1SHOT__20231227_085834__970.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5503 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_085431__459 | 1 | 0.0 | 7.06711 | 0 | [103, 199] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231214_085431__459.json | 55.0 | missing | missing | missing | |
| 5504 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_090300__600 | 1 | 0.0 | 3.48262 | 0 | [118, 193] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_090300__600.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5505 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaExpertAsk | 1SHOT | false | false | 5 | 20231225_090302__705 | 0 | 0.0 | 1.4102 | 0 | [118, 69] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_090302__705.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5506 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_085830__868 | 1 | 0.0 | 1.63907 | 0 | [118, 81] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231227_085830__868.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5507 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_085424__836 | 1 | 0.0 | 17.3931 | 0 | [201, 462] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231214_085424__836.json | 55.0 | missing | missing | missing | |
| 5508 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231225_090248__367 | 0 | 0.0 | 10.9244 | 0 | [204, 426] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_090248__367.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5509 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_090256__979 | 1 | 0.0 | 8.60965 | 0 | [204, 454] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_090256__979.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5510 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_085828__610 | 1 | 0.0 | 9.59217 | 0 | [204, 361] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231227_085828__610.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5511 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231214_085605__293 | 0 | 0.0 | 39.8487 | 0 | [11, 1010] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231214_085605__293.json | 25.0 | missing | missing | missing | |
| 5512 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_090347__914 | 1 | 0.0 | 6.35457 | 0 | [368, 291] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_090347__914.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5513 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_090351__219 | 1 | 0.0 | 3.95271 | 0 | [368, 162] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_090351__219.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5514 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_085846__446 | 1 | 0.0 | 5.35738 | 0 | [368, 235] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231227_085846__446.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5515 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_085526__943 | 1 | 0.0 | 20.1678 | 0 | [374, 465] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231214_085526__943.json | 55.0 | missing | missing | missing | |
| 5516 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 5 | 20231225_090336__144 | 0 | 0.0 | 8.91353 | 0 | [366, 422] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_090336__144.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5517 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_090340__683 | 1 | 0.0 | 3.81081 | 0 | [366, 154] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_090340__683.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5518 | Apple-MacBook-Pro-M1 | timezone_bumper | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_085841__105 | 0 | 0.0 | 7.37699 | 0 | [366, 339] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231227_085841__105.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5519 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | InJulia | 1SHOT | true | false | 5 | 20231214_084634__359 | 0 | 0.0 | 16.8786 | 0 | [74, 497] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__InJulia__1SHOT__20231214_084634__359.json | 25.0 | missing | missing | missing | |
| 5520 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_084553__578 | 1 | 0.0 | 15.1058 | 0 | [78, 491] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_084553__578.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5521 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231225_084601__632 | 5 | 0.0 | 8.18164 | 5 | [78, 263] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_084601__632.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5522 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | InJulia | 1SHOT | true | true | 5 | 20231227_084911__780 | 1 | 0.0 | 10.8221 | 0 | [78, 345] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__InJulia__1SHOT__20231227_084911__780.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5523 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_084617__751 | 1 | 0.0 | 13.2075 | 0 | [103, 381] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231214_084617__751.json | 55.0 | missing | missing | missing | |
| 5524 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_084528__421 | 5 | 0.0 | 6.17634 | 5 | [119, 190] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_084528__421.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5525 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_084537__794 | 1 | 0.0 | 9.13268 | 0 | [119, 288] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_084537__794.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5526 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_084900__314 | 1 | 0.0 | 6.86598 | 0 | [119, 211] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231227_084900__314.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5527 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_084604__216 | 1 | 0.0 | 21.903 | 0 | [201, 583] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231214_084604__216.json | 55.0 | missing | missing | missing | |
| 5528 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_084514__949 | 1 | 0.0 | 24.2235 | 0 | [217, 582] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_084514__949.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5529 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_084522__413 | 1 | 0.0 | 7.69081 | 0 | [217, 223] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_084522__413.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5530 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231227_084853__821 | 1 | 0.0 | 21.0202 | 0 | [217, 483] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231227_084853__821.json | 55.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5531 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | false | false | 5 | 20231214_084756__856 | 0 | 0.0 | 28.9025 | 0 | [11, 760] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231214_084756__856.json | 0.0 | missing | missing | missing | |
| 5532 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_084649__607 | 5 | 0.0 | 12.0454 | 5 | [386, 330] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_084649__607.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5533 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_084659__181 | 5 | 0.0 | 9.24105 | 5 | [386, 241] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_084659__181.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5534 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231227_084932__155 | 5 | 0.0 | 11.84 | 5 | [386, 321] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231227_084932__155.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5535 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_084727__246 | 1 | 0.0 | 44.94 | 0 | [374, 1054] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231214_084727__246.json | 55.0 | missing | missing | missing | |
| 5536 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_084626__844 | 5 | 0.0 | 8.21717 | 5 | [384, 212] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_084626__844.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5537 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_084637__916 | 5 | 0.0 | 10.7526 | 5 | [384, 294] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_084637__916.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5538 | Apple-MacBook-Pro-M1 | timezone_bumper | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 5 | 20231227_084921__444 | 5 | 0.0 | 9.60602 | 5 | [384, 255] | 0.6.0 | 5 | 1.0 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231227_084921__444.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5539 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231214_084830__817 | 0 | 0.0 | 10.0761 | 0 | [74, 300] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__InJulia__1SHOT__20231214_084830__817.json | 25.0 | missing | missing | missing | |
| 5540 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231225_085024__650 | 0 | 0.0 | 63.4636 | 0 | [75, 482] | 0.6.0 | 5 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_085024__650.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 5541 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | InJulia | 1SHOT | true | true | 5 | 20231225_085138__758 | 1 | 0.0 | 74.5738 | 0 | [75, 565] | 0.6.0 | 5 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_085138__758.json | 55.0 | missing | {"num_gpu": 99} | missing |
| 5542 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | InJulia | 1SHOT | true | false | 5 | 20231227_085217__462 | 0 | 0.0 | 71.4576 | 0 | [75, 538] | 0.6.0 | 5 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__InJulia__1SHOT__20231227_085217__462.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 5543 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231214_084820__259 | 1 | 0.0 | 7.67446 | 0 | [103, 218] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231214_084820__259.json | 55.0 | missing | missing | missing | |
| 5544 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_084857__606 | 1 | 0.0 | 13.3624 | 0 | [114, 86] | 0.6.0 | 5 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_084857__606.json | 55.0 | missing | {"num_gpu": 99} | missing |
| 5545 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231225_084920__376 | 1 | 0.0 | 22.5813 | 0 | [114, 159] | 0.6.0 | 5 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_084920__376.json | 55.0 | missing | {"num_gpu": 99} | missing |
| 5546 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 5 | 20231227_085105__943 | 1 | 0.0 | 49.4696 | 0 | [114, 365] | 0.6.0 | 5 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231227_085105__943.json | 55.0 | missing | {"num_gpu": 99} | missing |
| 5547 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231214_084812__384 | 1 | 0.0 | 15.9029 | 0 | [201, 422] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231214_084812__384.json | 55.0 | missing | missing | missing | |
| 5548 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_084819__596 | 1 | 0.0 | 80.5485 | 0 | [210, 404] | 0.6.0 | 5 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_084819__596.json | 55.0 | missing | {"num_gpu": 99} | missing |
| 5549 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 5 | 20231225_084844__582 | 1 | 0.0 | 24.1854 | 0 | [210, 154] | 0.6.0 | 5 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_084844__582.json | 55.0 | missing | {"num_gpu": 99} | missing |
| 5550 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | false | false | 5 | 20231227_085016__934 | 0 | 0.0 | 43.031 | 0 | [210, 126] | 0.6.0 | 5 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231227_085016__934.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 5551 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231214_084928__617 | 1 | 0.0 | 24.7752 | 0 | [11, 659] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231214_084928__617.json | 55.0 | missing | missing | missing | |
| 5552 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_085506__924 | 1 | 0.0 | 19.5945 | 0 | [388, 86] | 0.6.0 | 5 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_085506__924.json | 55.0 | missing | {"num_gpu": 99} | missing |
| 5553 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 5 | 20231225_085622__224 | 1 | 0.0 | 75.6569 | 0 | [388, 505] | 0.6.0 | 5 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_085622__224.json | 55.0 | missing | {"num_gpu": 99} | missing |
| 5554 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | false | 5 | 20231227_085549__170 | 0 | 0.0 | 129.041 | 0 | [388, 879] | 0.6.0 | 5 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231227_085549__170.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 5555 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231214_084903__326 | 1 | 0.0 | 20.8251 | 0 | [374, 481] | 0.4.0 | 5 | 1.0 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231214_084903__326.json | 55.0 | missing | missing | missing | |
| 5556 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_085407__347 | 5 | 0.0 | 24.9873 | 5 | [386, 127] | 0.6.0 | 5 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_085407__347.json | 100.0 | missing | {"num_gpu": 99} | missing |
| 5557 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 5 | 20231225_085447__339 | 1 | 0.0 | 39.2992 | 0 | [386, 236] | 0.6.0 | 5 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_085447__339.json | 55.0 | missing | {"num_gpu": 99} | missing |
| 5558 | Apple-MacBook-Pro-M1 | timezone_bumper | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 5 | 20231227_085339__370 | 0 | 0.0 | 82.8339 | 0 | [386, 552] | 0.6.0 | 5 | 1.0 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/timezone_bumper/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231227_085339__370.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 5559 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | InJulia | 1SHOT | true | false | 6 | 20231214_090625__335 | 0 | 0.0 | 9.72051 | 0 | [65, 290] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231214_090625__335.json | 25.0 | missing | missing | missing | |
| 5560 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | InJulia | 1SHOT | true | true | 6 | 20231225_073136__448 | 1 | 0.0 | 20.8466 | 2 | [73, 377] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_073136__448.json | 79.1667 | missing | {"num_gpu": 99} | missing |
| 5561 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | InJulia | 1SHOT | true | true | 6 | 20231225_073148__216 | 5 | 0.0 | 11.969 | 2 | [73, 212] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231225_073148__216.json | 95.8333 | missing | {"num_gpu": 99} | missing |
| 5562 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | InJulia | 1SHOT | true | true | 6 | 20231227_092722__153 | 5 | 0.0 | 9.92689 | 2 | [73, 176] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__InJulia__1SHOT__20231227_092722__153.json | 95.8333 | missing | {"num_gpu": 99} | missing |
| 5563 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231214_090616__453 | 0 | 0.0 | 12.6159 | 0 | [94, 369] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231214_090616__453.json | 25.0 | missing | missing | missing | |
| 5564 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_073109__393 | 5 | 0.0 | 8.43632 | 2 | [111, 140] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_073109__393.json | 95.8333 | missing | {"num_gpu": 99} | missing |
| 5565 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231225_073116__340 | 0 | 0.0 | 6.42257 | 0 | [111, 103] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231225_073116__340.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 5566 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231227_092712__712 | 0 | 0.0 | 7.36274 | 0 | [111, 122] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__JuliaExpertAsk__1SHOT__20231227_092712__712.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 5567 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20231214_090603__297 | 0 | 0.0 | 15.4316 | 0 | [175, 418] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231214_090603__297.json | 25.0 | missing | missing | missing | |
| 5568 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_073047__989 | 0 | 0.0 | 23.0735 | 0 | [193, 190] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_073047__989.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 5569 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_073100__720 | 5 | 0.0 | 13.6661 | 2 | [193, 218] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231225_073100__720.json | 95.8333 | missing | {"num_gpu": 99} | missing |
| 5570 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_092705__259 | 5 | 0.0 | 24.7619 | 2 | [193, 257] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__JuliaExpertCoTTask__1SHOT__20231227_092705__259.json | 95.8333 | missing | {"num_gpu": 99} | missing |
| 5571 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231214_090727__323 | 0 | 0.0 | 16.9242 | 0 | [11, 463] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231214_090727__323.json | 0.0 | missing | missing | missing | |
| 5572 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_073320__589 | 0 | 0.0 | 16.3373 | 2 | [376, 239] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_073320__589.json | 75.0 | missing | {"num_gpu": 99} | missing |
| 5573 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231225_073341__243 | 0 | 0.0 | 20.8104 | 0 | [376, 319] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231225_073341__243.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 5574 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_092748__732 | 5 | 0.0 | 13.4991 | 2 | [376, 192] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__JuliaRecapCoTTask__1SHOT__20231227_092748__732.json | 95.8333 | missing | {"num_gpu": 99} | missing |
| 5575 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 6 | 20231214_090710__190 | 0 | 0.0 | 27.8737 | 0 | [365, 662] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231214_090710__190.json | 50.0 | missing | missing | missing | |
| 5576 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_073242__878 | 2 | 0.0 | 17.4511 | 2 | [373, 259] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_073242__878.json | 83.3333 | missing | {"num_gpu": 99} | missing |
| 5577 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_073304__754 | 0 | 0.0 | 21.3841 | 0 | [373, 327] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231225_073304__754.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 5578 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-instruct | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_092735__953 | 0 | 0.0 | 12.1584 | 1 | [373, 168] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-instruct/evaluation__JuliaRecapTask__1SHOT__20231227_092735__953.json | 62.5 | missing | {"num_gpu": 99} | missing |
| 5579 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-python | InJulia | 1SHOT | true | false | 6 | 20231214_090808__677 | 0 | 0.0 | 12.3796 | 0 | [65, 368] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-python/evaluation__InJulia__1SHOT__20231214_090808__677.json | 25.0 | missing | missing | missing | |
| 5580 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-python | InJulia | 1SHOT | true | true | 6 | 20231225_073421__129 | 0 | 0.0 | 7.84252 | 2 | [47, 139] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_073421__129.json | 75.0 | missing | {"num_gpu": 99} | missing |
| 5581 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-python | InJulia | 1SHOT | false | false | 6 | 20231225_073432__397 | 0 | 0.0 | 11.0038 | 0 | [47, 194] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-python/evaluation__InJulia__1SHOT__20231225_073432__397.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 5582 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231214_090756__945 | 0 | 0.0 | 10.7007 | 0 | [94, 313] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231214_090756__945.json | 25.0 | missing | missing | missing | |
| 5583 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_073408__771 | 0 | 0.0 | 7.26065 | 0 | [48, 127] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_073408__771.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 5584 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-python | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231225_073413__579 | 0 | 0.0 | 5.04507 | 0 | [48, 86] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-python/evaluation__JuliaExpertAsk__1SHOT__20231225_073413__579.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 5585 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231214_090745__660 | 0 | 0.0 | 17.5796 | 0 | [175, 478] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231214_090745__660.json | 50.0 | missing | missing | missing | |
| 5586 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231225_073355__352 | 0 | 0.0 | 13.1916 | 0 | [68, 45] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_073355__352.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 5587 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-python | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231225_073401__661 | 0 | 0.0 | 6.37326 | 0 | [68, 107] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-python/evaluation__JuliaExpertCoTTask__1SHOT__20231225_073401__661.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 5588 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231214_090853__120 | 0 | 0.0 | 10.7516 | 0 | [11, 297] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231214_090853__120.json | 0.0 | missing | missing | missing | |
| 5589 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231225_073448__912 | 0 | 0.0 | 1.30662 | 0 | [65, 9] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_073448__912.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 5590 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-python | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231225_073502__744 | 0 | 0.0 | 13.3298 | 0 | [65, 210] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-python/evaluation__JuliaRecapCoTTask__1SHOT__20231225_073502__744.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 5591 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-python | JuliaRecapTask | 1SHOT | true | true | 6 | 20231214_090842__846 | 0 | 0.0 | 19.358 | 1 | [365, 447] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231214_090842__846.json | 62.5 | missing | missing | missing | |
| 5592 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-python | JuliaRecapTask | 1SHOT | true | false | 6 | 20231225_073446__924 | 0 | 0.0 | 6.06632 | 0 | [62, 103] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_073446__924.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 5593 | Apple-MacBook-Pro-M1 | wrap_string | codellama:13b-python | JuliaRecapTask | 1SHOT | false | false | 6 | 20231225_073447__382 | 0 | 0.0 | 1.06626 | 0 | [62, 8] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/codellama:13b-python/evaluation__JuliaRecapTask__1SHOT__20231225_073447__382.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 5594 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231225_075628__848 | 0 | 0.0 | 59.0252 | 1 | [65, 328] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_075628__848.json | 62.5 | missing | {"num_gpu": 99} | missing |
| 5595 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231225_075713__902 | 0 | 0.0 | 43.7698 | 1 | [65, 233] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_075713__902.json | 62.5 | missing | {"num_gpu": 99} | missing |
| 5596 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231227_093836__439 | 0 | 0.0 | 64.2913 | 1 | [65, 391] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_093836__439.json | 62.5 | missing | {"num_gpu": 99} | missing |
| 5597 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_075417__621 | 0 | 0.0 | 60.2185 | 1 | [106, 321] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_075417__621.json | 62.5 | missing | {"num_gpu": 99} | missing |
| 5598 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231225_075528__670 | 0 | 0.0 | 69.9139 | 0 | [106, 402] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_075528__670.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 5599 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_093731__258 | 0 | 0.0 | 93.3468 | 0 | [106, 561] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_093731__258.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 5600 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20231225_075209__622 | 0 | 0.0 | 80.5325 | 0 | [187, 273] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_075209__622.json | 25.0 | missing | {"num_gpu": 99} | missing |
| 5601 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_075314__780 | 0 | 0.0 | 64.1921 | 0 | [187, 363] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_075314__780.json | 50.0 | missing | {"num_gpu": 99} | missing |
| 5602 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_093557__864 | 0 | 0.0 | 58.4494 | 1 | [187, 187] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_093557__864.json | 62.5 | missing | {"num_gpu": 99} | missing |
| 5603 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_080319__125 | 0 | 0.0 | 86.019 | 1 | [394, 444] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_080319__125.json | 62.5 | missing | {"num_gpu": 99} | missing |
| 5604 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_080438__663 | 6 | 0.0 | 77.0306 | 2 | [394, 385] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_080438__663.json | 100.0 | missing | {"num_gpu": 99} | missing |
| 5605 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_094112__111 | 3 | 0.0 | 55.5684 | 2 | [394, 279] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_094112__111.json | 87.5 | missing | {"num_gpu": 99} | missing |
| 5606 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 6 | 20231225_080054__598 | 0 | 0.0 | 102.021 | 0 | [392, 545] | 0.6.0 | 2 | 1.1 | {"options": {"num_gpu": 99}} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_080054__598.json | 0.0 | missing | {"num_gpu": 99} | missing |
| 5607 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_080151__170 | 5 | 0.0 | 56.3985 | 2 | [392, 269] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_080151__170.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5608 | Apple-MacBook-Pro-M1 | wrap_string | deepseek-coder:33b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_094016__865 | 0 | 0.0 | 100.144 | 0 | [392, 545] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/deepseek-coder:33b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_094016__865.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5609 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 6 | 20231227_094850__763 | 0 | 0.0 | 12.2957 | 0 | [63, 475] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_094850__763.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5610 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 6 | 20231227_133332__431 | 0 | 0.0 | 8.17932 | 0 | [63, 319] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_133332__431.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5611 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | true | 6 | 20231227_133340__643 | 0 | 0.0 | 7.88626 | 0 | [63, 308] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_133340__643.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5612 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | InJulia | 1SHOT | true | false | 6 | 20231227_133351__437 | 0 | 0.0 | 10.9243 | 0 | [63, 424] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__InJulia__1SHOT__20231227_133351__437.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5613 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231227_094838__305 | 0 | 0.0 | 9.04442 | 0 | [100, 344] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_094838__305.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5614 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231227_133306__633 | 0 | 0.0 | 8.49489 | 0 | [100, 323] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_133306__633.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5615 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231227_133318__832 | 0 | 0.0 | 12.4017 | 0 | [100, 470] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_133318__832.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5616 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231227_133324__452 | 0 | 0.0 | 5.32769 | 0 | [100, 200] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_133324__452.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5617 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_094829__867 | 0 | 0.0 | 7.46961 | 0 | [179, 142] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_094829__867.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5618 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231227_133237__971 | 0 | 0.0 | 11.0759 | 0 | [179, 285] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_133237__971.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5619 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231227_133241__680 | 0 | 0.0 | 4.42447 | 0 | [179, 154] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_133241__680.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5620 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_133257__785 | 0 | 0.0 | 16.2088 | 0 | [179, 592] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_133257__785.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5621 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231227_094915__772 | 0 | 0.0 | 12.9103 | 0 | [352, 437] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_094915__772.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5622 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_133424__864 | 0 | 0.0 | 11.284 | 0 | [352, 379] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_133424__864.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5623 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 6 | 20231227_133433__902 | 0 | 0.0 | 8.9297 | 0 | [352, 295] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_133433__902.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5624 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_133444__708 | 0 | 0.0 | 11.1893 | 0 | [352, 376] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_133444__708.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5625 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_094902__879 | 1 | 0.0 | 11.8357 | 0 | [349, 399] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_094902__879.json | 54.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5626 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 6 | 20231227_133359__271 | 0 | 0.0 | 8.03622 | 0 | [349, 262] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_133359__271.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5627 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | false | false | 6 | 20231227_133405__498 | 0 | 0.0 | 6.15165 | 0 | [349, 192] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_133405__498.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5628 | Apple-MacBook-Pro-M1 | wrap_string | dolphin-phi:2.7b-v2.6-q6_K | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_133413__604 | 0 | 0.0 | 8.00994 | 0 | [349, 261] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/dolphin-phi:2.7b-v2.6-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_133413__604.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5629 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | InJulia | 1SHOT | true | false | 6 | 20231214_090003__841 | 0 | 0.0 | 15.5205 | 0 | [65, 461] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__InJulia__1SHOT__20231214_090003__841.json | 25.0 | missing | missing | missing | |
| 5630 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | InJulia | 1SHOT | true | false | 6 | 20231225_071219__523 | 0 | 0.0 | 13.329 | 0 | [65, 394] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__InJulia__1SHOT__20231225_071219__523.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5631 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | InJulia | 1SHOT | false | false | 6 | 20231225_071233__485 | 0 | 0.0 | 13.7704 | 0 | [1, 422] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__InJulia__1SHOT__20231225_071233__485.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5632 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | InJulia | 1SHOT | true | false | 6 | 20231227_092002__780 | 0 | 0.0 | 12.5024 | 0 | [65, 380] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__InJulia__1SHOT__20231227_092002__780.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5633 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231214_085947__426 | 0 | 0.0 | 8.25063 | 2 | [94, 241] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaExpertAsk__1SHOT__20231214_085947__426.json | 75.0 | missing | missing | missing | |
| 5634 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231225_071158__800 | 0 | 0.0 | 9.65799 | 0 | [94, 280] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_071158__800.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5635 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231225_071206__292 | 0 | 0.0 | 7.50202 | 0 | [1, 234] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaExpertAsk__1SHOT__20231225_071206__292.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5636 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231227_091950__997 | 0 | 0.0 | 12.9728 | 0 | [94, 386] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaExpertAsk__1SHOT__20231227_091950__997.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5637 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20231214_085939__134 | 0 | 0.0 | 15.0628 | 0 | [175, 408] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_085939__134.json | 25.0 | missing | missing | missing | |
| 5638 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_071127__609 | 0 | 0.0 | 24.4218 | 0 | [193, 420] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_071127__609.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5639 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_071148__175 | 0 | 0.0 | 19.1673 | 0 | [1, 548] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_071148__175.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5640 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_091936__904 | 0 | 0.0 | 22.599 | 0 | [193, 441] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_091936__904.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5641 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231214_090108__418 | 0 | 0.0 | 27.0922 | 0 | [11, 718] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_090108__418.json | 50.0 | missing | missing | missing | |
| 5642 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231225_071354__776 | 0 | 0.0 | 20.4252 | 0 | [11, 549] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_071354__776.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5643 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231225_071414__243 | 0 | 0.0 | 19.9276 | 0 | [1, 542] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_071414__243.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5644 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231227_092032__994 | 0 | 0.0 | 2.41351 | 0 | [11, 66] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_092032__994.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5645 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaRecapTask | 1SHOT | false | false | 6 | 20231214_090041__682 | 0 | 0.0 | 23.256 | 0 | [365, 548] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaRecapTask__1SHOT__20231214_090041__682.json | 0.0 | missing | missing | missing | |
| 5646 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaRecapTask | 1SHOT | false | false | 6 | 20231225_071311__274 | 0 | 0.0 | 15.3346 | 0 | [365, 337] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_071311__274.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5647 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaRecapTask | 1SHOT | true | false | 6 | 20231225_071334__658 | 0 | 0.0 | 22.4462 | 0 | [1, 605] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaRecapTask__1SHOT__20231225_071334__658.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5648 | Apple-MacBook-Pro-M1 | wrap_string | llama2 | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_092030__850 | 2 | 0.0 | 27.6493 | 2 | [365, 666] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/llama2/evaluation__JuliaRecapTask__1SHOT__20231227_092030__850.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5649 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | InJulia | 1SHOT | true | true | 6 | 20231214_090941__541 | 0 | 0.0 | 17.3566 | 2 | [65, 512] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__InJulia__1SHOT__20231214_090941__541.json | 75.0 | missing | missing | missing | |
| 5650 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | InJulia | 1SHOT | true | true | 6 | 20231225_073555__338 | 0 | 0.0 | 9.56099 | 1 | [65, 304] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__InJulia__1SHOT__20231225_073555__338.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5651 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | InJulia | 1SHOT | true | true | 6 | 20231225_073603__926 | 0 | 0.0 | 7.14085 | 1 | [65, 225] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__InJulia__1SHOT__20231225_073603__926.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5652 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | InJulia | 1SHOT | true | true | 6 | 20231227_092826__312 | 3 | 0.0 | 12.358 | 2 | [65, 406] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__InJulia__1SHOT__20231227_092826__312.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5653 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231214_090923__979 | 0 | 0.0 | 9.88133 | 0 | [94, 290] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231214_090923__979.json | 25.0 | missing | missing | missing | |
| 5654 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_073535__606 | 0 | 0.0 | 7.82167 | 1 | [104, 243] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_073535__606.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5655 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_073545__999 | 0 | 0.0 | 10.0075 | 0 | [104, 315] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231225_073545__999.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5656 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_092814__986 | 0 | 0.0 | 9.81063 | 0 | [104, 315] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaExpertAsk__1SHOT__20231227_092814__986.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5657 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231214_090913__635 | 0 | 0.0 | 20.4157 | 0 | [175, 555] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231214_090913__635.json | 0.0 | missing | missing | missing | |
| 5658 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_073518__766 | 0 | 0.0 | 16.0027 | 1 | [185, 289] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_073518__766.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5659 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_073527__936 | 1 | 0.0 | 9.475 | 2 | [185, 283] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231225_073527__936.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5660 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_092804__440 | 6 | 0.0 | 15.4852 | 2 | [185, 299] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaExpertCoTTask__1SHOT__20231227_092804__440.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5661 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231214_091039__113 | 0 | 0.0 | 20.7042 | 0 | [11, 560] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231214_091039__113.json | 0.0 | missing | missing | missing | |
| 5662 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_073650__984 | 0 | 0.0 | 13.2226 | 1 | [368, 368] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_073650__984.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5663 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_073701__632 | 4 | 0.0 | 10.8874 | 2 | [368, 299] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231225_073701__632.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5664 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_092850__886 | 4 | 0.0 | 10.8554 | 2 | [368, 301] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaRecapCoTTask__1SHOT__20231227_092850__886.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5665 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaRecapTask | 1SHOT | false | false | 6 | 20231214_091018__845 | 0 | 0.0 | 20.3092 | 0 | [365, 472] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaRecapTask__1SHOT__20231214_091018__845.json | 0.0 | missing | missing | missing | |
| 5666 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_073627__132 | 0 | 0.0 | 13.0086 | 1 | [365, 362] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_073627__132.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5667 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_073636__922 | 3 | 0.0 | 8.96252 | 2 | [365, 236] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaRecapTask__1SHOT__20231225_073636__922.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5668 | Apple-MacBook-Pro-M1 | wrap_string | magicoder | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_092839__652 | 0 | 0.0 | 12.9749 | 0 | [365, 368] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder/evaluation__JuliaRecapTask__1SHOT__20231227_092839__652.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5669 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 6 | 20231227_183434__412 | 0 | 0.0 | 16.037 | 1 | [65, 311] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_183434__412.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5670 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 6 | 20231227_183454__858 | 0 | 0.0 | 20.4749 | 1 | [65, 398] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_183454__858.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5671 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | InJulia | 1SHOT | true | true | 6 | 20231227_183511__140 | 4 | 0.0 | 16.1547 | 2 | [65, 313] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__InJulia__1SHOT__20231227_183511__140.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5672 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_183349__811 | 0 | 0.0 | 11.1204 | 0 | [104, 209] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_183349__811.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5673 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_183403__115 | 6 | 0.0 | 14.0422 | 2 | [104, 267] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_183403__115.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5674 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_183418__393 | 6 | 0.0 | 14.6978 | 2 | [104, 280] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_183418__393.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5675 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_183310__968 | 3 | 0.0 | 13.9598 | 2 | [185, 257] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_183310__968.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5676 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_183321__211 | 3 | 0.0 | 10.7731 | 2 | [185, 194] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_183321__211.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5677 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_183337__327 | 3 | 0.0 | 16.4286 | 2 | [185, 305] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_183337__327.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5678 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_183624__576 | 3 | 0.0 | 21.6535 | 2 | [368, 380] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_183624__576.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5679 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_183640__243 | 4 | 0.0 | 16.8115 | 2 | [368, 288] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_183640__243.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5680 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_183659__967 | 4 | 0.0 | 18.2868 | 2 | [368, 316] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_183659__967.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5681 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_183532__404 | 2 | 0.0 | 21.0304 | 2 | [365, 368] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_183532__404.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5682 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_183551__213 | 4 | 0.0 | 18.7041 | 2 | [365, 324] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_183551__213.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5683 | Apple-MacBook-Pro-M1 | wrap_string | magicoder:7b-s-cl-q6_K | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_183602__995 | 0 | 0.0 | 11.0379 | 2 | [365, 177] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/magicoder:7b-s-cl-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_183602__995.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5684 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | false | 6 | 20231225_080950__643 | 0 | 0.0 | 12.1903 | 0 | [61, 311] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_080950__643.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5685 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231225_080959__160 | 2 | 0.0 | 8.37088 | 2 | [61, 211] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231225_080959__160.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5686 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231227_094353__655 | 4 | 0.0 | 9.59407 | 2 | [61, 242] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__InJulia__1SHOT__20231227_094353__655.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5687 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231225_080928__995 | 0 | 0.0 | 7.08154 | 0 | [102, 169] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_080928__995.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5688 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_080938__546 | 2 | 0.0 | 9.88958 | 2 | [102, 242] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_080938__546.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5689 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231227_094343__550 | 0 | 0.0 | 8.30971 | 0 | [102, 200] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_094343__550.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5690 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_080909__256 | 0 | 0.0 | 23.7638 | 1 | [183, 428] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_080909__256.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5691 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_080921__505 | 0 | 0.0 | 11.5366 | 0 | [183, 273] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_080921__505.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5692 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_094334__433 | 0 | 0.0 | 24.4784 | 2 | [183, 463] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_094334__433.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5693 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_081045__815 | 0 | 0.0 | 15.3386 | 0 | [369, 338] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_081045__815.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5694 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231225_081103__252 | 0 | 0.0 | 18.1685 | 0 | [369, 408] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_081103__252.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5695 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231227_094423__546 | 0 | 0.0 | 13.9783 | 0 | [369, 303] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_094423__546.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5696 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 6 | 20231225_081013__139 | 0 | 0.0 | 10.0488 | 0 | [367, 206] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_081013__139.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5697 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_081029__194 | 3 | 0.0 | 16.4772 | 2 | [367, 366] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_081029__194.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5698 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 6 | 20231227_094409__558 | 0 | 0.0 | 16.3317 | 0 | [367, 361] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_094409__558.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5699 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 6 | 20231228_005246__170 | 0 | 0.0 | 9.95555 | 0 | [60, 319] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_005246__170.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5700 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 6 | 20231228_005303__614 | 2 | 0.0 | 16.3505 | 0 | [60, 523] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_005303__614.json | 58.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5701 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 6 | 20231228_005315__108 | 0 | 0.0 | 11.6737 | 2 | [60, 374] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_005315__108.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5702 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | true | 6 | 20231228_005326__472 | 0 | 0.0 | 11.0985 | 1 | [60, 356] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_005326__472.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5703 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | InJulia | 1SHOT | true | false | 6 | 20231228_005338__974 | 0 | 0.0 | 11.261 | 0 | [60, 361] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__InJulia__1SHOT__20231228_005338__974.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5704 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231228_005201__607 | 0 | 0.0 | 7.52045 | 0 | [101, 229] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_005201__607.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5705 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231228_005207__272 | 0 | 0.0 | 5.98151 | 0 | [101, 179] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_005207__272.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5706 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231228_005220__472 | 0 | 0.0 | 12.8179 | 0 | [101, 401] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_005220__472.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5707 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231228_005227__487 | 0 | 0.0 | 7.3174 | 1 | [101, 223] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_005227__487.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5708 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231228_005236__658 | 0 | 0.0 | 8.75651 | 0 | [101, 269] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertAsk__1SHOT__20231228_005236__658.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5709 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20231228_005102__808 | 0 | 0.0 | 11.3916 | 0 | [182, 310] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_005102__808.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5710 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231228_005119__558 | 0 | 0.0 | 16.5265 | 0 | [182, 501] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_005119__558.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5711 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231228_005131__999 | 0 | 0.0 | 11.8957 | 1 | [182, 358] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_005131__999.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5712 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231228_005146__785 | 0 | 0.0 | 14.9141 | 1 | [182, 453] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_005146__785.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5713 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231228_005154__871 | 0 | 0.0 | 7.58889 | 1 | [182, 220] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaExpertCoTTask__1SHOT__20231228_005154__871.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5714 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231228_005511__904 | 0 | 0.0 | 12.6459 | 1 | [368, 345] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_005511__904.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5715 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231228_005521__630 | 0 | 0.0 | 9.66916 | 0 | [368, 253] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_005521__630.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5716 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231228_005537__699 | 0 | 0.0 | 16.2673 | 0 | [368, 456] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_005537__699.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5717 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231228_005553__172 | 0 | 0.0 | 15.9933 | 0 | [368, 448] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_005553__172.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5718 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231228_005609__437 | 2 | 0.0 | 15.3899 | 2 | [368, 430] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapCoTTask__1SHOT__20231228_005609__437.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5719 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 6 | 20231228_005354__863 | 0 | 0.0 | 16.5508 | 1 | [366, 465] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_005354__863.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5720 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 6 | 20231228_005408__622 | 0 | 0.0 | 13.9441 | 0 | [366, 386] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_005408__622.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5721 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 6 | 20231228_005423__246 | 0 | 0.0 | 14.4643 | 1 | [366, 402] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_005423__246.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5722 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | false | 6 | 20231228_005441__352 | 0 | 0.0 | 18.2581 | 0 | [366, 517] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_005441__352.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5723 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_0 | JuliaRecapTask | 1SHOT | true | true | 6 | 20231228_005458__632 | 0 | 0.0 | 16.9208 | 1 | [366, 476] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_0/evaluation__JuliaRecapTask__1SHOT__20231228_005458__632.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5724 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | false | false | 6 | 20231228_005845__660 | 0 | 0.0 | 19.2672 | 0 | [60, 487] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_005845__660.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5725 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231228_005900__459 | 0 | 0.0 | 14.7723 | 1 | [60, 374] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_005900__459.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5726 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231228_005917__705 | 0 | 0.0 | 16.7526 | 1 | [60, 424] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_005917__705.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5727 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231228_005940__744 | 2 | 0.0 | 23.7066 | 2 | [60, 598] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_005940__744.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5728 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231228_005954__600 | 0 | 0.0 | 13.8082 | 1 | [60, 349] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__InJulia__1SHOT__20231228_005954__600.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5729 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231228_005745__278 | 0 | 0.0 | 11.7319 | 0 | [101, 287] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_005745__278.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5730 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231228_005754__197 | 0 | 0.0 | 9.85606 | 1 | [101, 239] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_005754__197.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5731 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231228_005808__103 | 0 | 0.0 | 13.2707 | 0 | [101, 326] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_005808__103.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5732 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231228_005818__745 | 0 | 0.0 | 10.092 | 1 | [101, 245] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_005818__745.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5733 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231228_005825__708 | 0 | 0.0 | 7.34317 | 1 | [101, 174] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231228_005825__708.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5734 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231228_005632__201 | 0 | 0.0 | 22.6391 | 0 | [182, 518] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_005632__201.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5735 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20231228_005643__897 | 0 | 0.0 | 11.2898 | 0 | [182, 265] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_005643__897.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5736 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231228_005657__419 | 0 | 0.0 | 14.5415 | 0 | [182, 347] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_005657__419.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5737 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231228_005717__832 | 0 | 0.0 | 19.3888 | 0 | [182, 468] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_005717__832.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5738 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231228_005733__389 | 0 | 0.0 | 15.9033 | 0 | [182, 381] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231228_005733__389.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5739 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231228_010144__794 | 0 | 0.0 | 18.7775 | 0 | [368, 419] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_010144__794.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5740 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231228_010204__997 | 0 | 0.0 | 20.5294 | 1 | [368, 462] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_010204__997.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5741 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231228_010224__595 | 3 | 0.0 | 19.9134 | 1 | [368, 447] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_010224__595.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5742 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231228_010242__713 | 0 | 0.0 | 17.8266 | 0 | [368, 396] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_010242__713.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5743 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231228_010300__489 | 0 | 0.0 | 17.3432 | 1 | [368, 384] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231228_010300__489.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5744 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231228_010010__506 | 0 | 0.0 | 15.4681 | 1 | [366, 338] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_010010__506.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5745 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231228_010032__228 | 0 | 0.0 | 21.862 | 1 | [366, 494] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_010032__228.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5746 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 6 | 20231228_010051__442 | 0 | 0.0 | 19.345 | 0 | [366, 433] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_010051__442.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5747 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231228_010111__893 | 1 | 0.0 | 19.6223 | 2 | [366, 440] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_010111__893.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5748 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231228_010125__817 | 2 | 0.0 | 13.5999 | 2 | [366, 292] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231228_010125__817.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5749 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 6 | 20231226_124611__548 | 0 | 0.0 | 21.5196 | 1 | [60, 380] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_124611__548.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5750 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 6 | 20231226_124635__587 | 1 | 0.0 | 24.1213 | 2 | [60, 437] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231226_124635__587.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5751 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | InJulia | 1SHOT | true | true | 6 | 20231227_094735__663 | 0 | 0.0 | 25.5407 | 1 | [60, 473] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__InJulia__1SHOT__20231227_094735__663.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5752 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231226_124531__662 | 0 | 0.0 | 16.1485 | 1 | [101, 289] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_124531__662.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5753 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231226_124549__932 | 0 | 0.0 | 18.0881 | 0 | [101, 316] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231226_124549__932.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5754 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_094709__109 | 0 | 0.0 | 13.2257 | 1 | [101, 237] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_094709__109.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5755 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20231226_124450__591 | 0 | 0.0 | 18.7843 | 0 | [182, 328] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_124450__591.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5756 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231226_124515__123 | 0 | 0.0 | 24.8781 | 1 | [182, 432] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231226_124515__123.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5757 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20231227_094656__898 | 0 | 0.0 | 29.7509 | 0 | [182, 369] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_094656__898.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5758 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231226_124828__348 | 0 | 0.0 | 21.3832 | 1 | [368, 354] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_124828__348.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5759 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231226_124905__901 | 0 | 0.0 | 36.906 | 1 | [368, 628] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231226_124905__901.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5760 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_094821__168 | 2 | 0.0 | 22.1892 | 2 | [368, 370] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_094821__168.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5761 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | false | 6 | 20231226_124740__343 | 0 | 0.0 | 25.5504 | 0 | [366, 418] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_124740__343.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5762 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | false | 6 | 20231226_124806__335 | 0 | 0.0 | 26.3005 | 0 | [366, 439] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231226_124806__335.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5763 | Apple-MacBook-Pro-M1 | wrap_string | mistral:7b-instruct-v0.2-q6_K | JuliaRecapTask | 1SHOT | true | false | 6 | 20231227_094759__582 | 0 | 0.0 | 24.1818 | 0 | [366, 406] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/mistral:7b-instruct-v0.2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_094759__582.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5764 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231227_095221__951 | 0 | 0.0 | 58.0239 | 0 | [65, 343] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_095221__951.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5765 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | false | false | 6 | 20231227_134039__389 | 0 | 0.0 | 28.3166 | 0 | [65, 162] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_134039__389.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5766 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231227_134121__596 | 0 | 0.0 | 41.4799 | 1 | [65, 243] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_134121__596.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5767 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231227_134157__188 | 0 | 0.0 | 36.6099 | 1 | [65, 213] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__InJulia__1SHOT__20231227_134157__188.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5768 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_095123__548 | 0 | 0.0 | 50.2635 | 1 | [104, 290] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_095123__548.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5769 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231227_133815__467 | 0 | 0.0 | 26.5751 | 0 | [104, 146] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_133815__467.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5770 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_133902__672 | 0 | 0.0 | 46.8198 | 0 | [104, 270] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_133902__672.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5771 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231227_134011__752 | 0 | 0.0 | 68.704 | 0 | [104, 402] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_134011__752.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5772 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_095033__822 | 0 | 0.0 | 77.5115 | 1 | [184, 298] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_095033__822.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5773 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_133550__919 | 2 | 0.0 | 65.2941 | 2 | [184, 343] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_133550__919.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5774 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_133655__938 | 0 | 0.0 | 65.4788 | 1 | [184, 370] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_133655__938.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5775 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_133748__319 | 0 | 0.0 | 52.7643 | 1 | [184, 294] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_133748__319.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5776 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_095447__646 | 0 | 0.0 | 65.7757 | 1 | [378, 335] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_095447__646.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5777 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_134626__852 | 2 | 0.0 | 45.0787 | 2 | [378, 212] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_134626__852.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5778 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231227_140032__982 | 0 | 0.0 | 20.2129 | 0 | [378, 5] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_140032__982.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5779 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_140259__869 | 0 | 0.0 | 146.594 | 1 | [378, 769] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_140259__869.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5780 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_095341__346 | 0 | 0.0 | 79.8964 | 0 | [376, 418] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_095341__346.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5781 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | false | 6 | 20231227_134300__781 | 0 | 0.0 | 61.9905 | 0 | [376, 314] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_134300__781.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5782 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_134432__765 | 0 | 0.0 | 92.8684 | 0 | [376, 494] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_134432__765.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5783 | Apple-MacBook-Pro-M1 | wrap_string | nous-hermes2:34b-yi-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_134540__546 | 4 | 0.0 | 67.3954 | 2 | [376, 344] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/nous-hermes2:34b-yi-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_134540__546.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5784 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231225_081227__255 | 4 | 0.0 | 11.6657 | 2 | [69, 293] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_081227__255.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5785 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231225_081237__394 | 3 | 0.0 | 10.0398 | 2 | [69, 251] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231225_081237__394.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5786 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231227_094502__526 | 0 | 0.0 | 13.0414 | 0 | [69, 325] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__InJulia__1SHOT__20231227_094502__526.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5787 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_081201__330 | 0 | 0.0 | 12.0488 | 0 | [110, 297] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_081201__330.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5788 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_081215__781 | 2 | 0.0 | 13.6557 | 2 | [110, 338] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_081215__781.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5789 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_094449__279 | 0 | 0.0 | 11.2717 | 1 | [110, 262] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_094449__279.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5790 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20231225_081134__640 | 0 | 0.0 | 30.2001 | 0 | [191, 581] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_081134__640.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5791 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_081149__592 | 1 | 0.0 | 15.221 | 2 | [191, 366] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_081149__592.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5792 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_094437__329 | 5 | 0.0 | 14.097 | 2 | [191, 183] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_094437__329.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5793 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_081343__915 | 3 | 0.0 | 17.2608 | 2 | [377, 385] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_081343__915.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5794 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_081358__471 | 4 | 0.0 | 14.8113 | 2 | [377, 324] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_081358__471.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5795 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_094526__441 | 4 | 0.0 | 11.8538 | 2 | [377, 249] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_094526__441.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5796 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_081311__266 | 1 | 0.0 | 13.2143 | 0 | [375, 285] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_081311__266.json | 54.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5797 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_081326__994 | 5 | 0.0 | 14.8795 | 2 | [375, 326] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_081326__994.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5798 | Apple-MacBook-Pro-M1 | wrap_string | openchat:7b-v3.5-1210-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_094514__439 | 3 | 0.0 | 12.2026 | 2 | [375, 257] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openchat:7b-v3.5-1210-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_094514__439.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5799 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | InJulia | 1SHOT | true | false | 6 | 20231214_090148__772 | 0 | 0.0 | 20.1072 | 0 | [65, 589] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231214_090148__772.json | 25.0 | missing | missing | missing | |
| 5800 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 6 | 20231225_071514__817 | 2 | 0.0 | 12.2569 | 2 | [67, 384] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_071514__817.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5801 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 6 | 20231225_071524__219 | 4 | 0.0 | 9.71242 | 1 | [67, 304] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231225_071524__219.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5802 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | InJulia | 1SHOT | true | true | 6 | 20231227_092121__914 | 0 | 0.0 | 19.2598 | 0 | [67, 618] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__InJulia__1SHOT__20231227_092121__914.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5803 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231214_090128__597 | 0 | 0.0 | 6.77809 | 0 | [94, 195] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231214_090128__597.json | 25.0 | missing | missing | missing | |
| 5804 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_071451__346 | 0 | 0.0 | 9.77129 | 2 | [108, 299] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_071451__346.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5805 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_071502__475 | 1 | 0.0 | 11.3287 | 2 | [108, 349] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231225_071502__475.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5806 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_092102__597 | 2 | 0.0 | 10.1531 | 2 | [108, 320] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaExpertAsk__1SHOT__20231227_092102__597.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5807 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231214_090121__677 | 0 | 0.0 | 12.2 | 0 | [175, 328] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231214_090121__677.json | 0.0 | missing | missing | missing | |
| 5808 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20231225_071430__185 | 0 | 0.0 | 15.8392 | 0 | [189, 308] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_071430__185.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5809 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_071441__417 | 1 | 0.0 | 10.3551 | 2 | [189, 306] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231225_071441__417.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5810 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_092051__898 | 0 | 0.0 | 19.0355 | 0 | [189, 434] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaExpertCoTTask__1SHOT__20231227_092051__898.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5811 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231214_090220__255 | 0 | 0.0 | 1.49227 | 0 | [11, 38] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231214_090220__255.json | 0.0 | missing | missing | missing | |
| 5812 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_071646__637 | 5 | 0.0 | 20.0825 | 2 | [375, 565] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_071646__637.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5813 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_071657__106 | 0 | 0.0 | 10.5812 | 0 | [375, 279] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231225_071657__106.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5814 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_092148__688 | 0 | 0.0 | 13.664 | 2 | [375, 383] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaRecapCoTTask__1SHOT__20231227_092148__688.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5815 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | false | false | 6 | 20231214_090219__819 | 0 | 0.0 | 21.0518 | 0 | [365, 491] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231214_090219__819.json | 0.0 | missing | missing | missing | |
| 5816 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_071607__834 | 0 | 0.0 | 16.6305 | 1 | [373, 462] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_071607__834.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5817 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_071626__131 | 0 | 0.0 | 19.0875 | 2 | [373, 537] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231225_071626__131.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5818 | Apple-MacBook-Pro-M1 | wrap_string | openhermes2.5-mistral | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_092135__458 | 2 | 0.0 | 13.3705 | 2 | [373, 374] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/openhermes2.5-mistral/evaluation__JuliaRecapTask__1SHOT__20231227_092135__458.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5819 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | InJulia | 1SHOT | true | false | 6 | 20231214_091243__882 | 0 | 0.0 | 15.5653 | 0 | [65, 462] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__InJulia__1SHOT__20231214_091243__882.json | 25.0 | missing | missing | missing | |
| 5820 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | InJulia | 1SHOT | false | false | 6 | 20231225_073950__106 | 0 | 0.0 | 3.41398 | 0 | [68, 50] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__InJulia__1SHOT__20231225_073950__106.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5821 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | InJulia | 1SHOT | true | true | 6 | 20231225_074008__138 | 6 | 0.0 | 18.4266 | 2 | [68, 331] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__InJulia__1SHOT__20231225_074008__138.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5822 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | InJulia | 1SHOT | true | true | 6 | 20231227_093025__219 | 0 | 0.0 | 23.1359 | 1 | [68, 422] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__InJulia__1SHOT__20231227_093025__219.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5823 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231214_091227__288 | 0 | 0.0 | 7.28997 | 0 | [94, 211] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231214_091227__288.json | 50.0 | missing | missing | missing | |
| 5824 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_073930__197 | 1 | 0.0 | 21.6025 | 2 | [107, 381] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_073930__197.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5825 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231225_073946__761 | 0 | 0.0 | 16.6054 | 0 | [107, 292] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231225_073946__761.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5826 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231227_093002__199 | 0 | 0.0 | 13.8947 | 0 | [107, 245] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaExpertAsk__1SHOT__20231227_093002__199.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5827 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20231214_091220__566 | 0 | 0.0 | 12.3513 | 0 | [175, 332] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231214_091220__566.json | 25.0 | missing | missing | missing | |
| 5828 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231225_073848__120 | 0 | 0.0 | 26.7905 | 0 | [188, 285] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_073848__120.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5829 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231225_073908__684 | 0 | 0.0 | 19.6467 | 0 | [188, 331] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231225_073908__684.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5830 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231227_092948__469 | 0 | 0.0 | 21.7038 | 0 | [188, 200] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaExpertCoTTask__1SHOT__20231227_092948__469.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5831 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231214_091351__558 | 0 | 0.0 | 14.0463 | 0 | [11, 389] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231214_091351__558.json | 0.0 | missing | missing | missing | |
| 5832 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_074159__135 | 6 | 0.0 | 36.4683 | 2 | [371, 587] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_074159__135.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5833 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231225_074205__740 | 0 | 0.0 | 5.64447 | 0 | [371, 47] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231225_074205__740.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5834 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_093134__692 | 0 | 0.0 | 38.4027 | 2 | [371, 615] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaRecapCoTTask__1SHOT__20231227_093134__692.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5835 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaRecapTask | 1SHOT | true | false | 6 | 20231214_091337__234 | 0 | 0.0 | 30.0525 | 0 | [365, 715] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231214_091337__234.json | 25.0 | missing | missing | missing | |
| 5836 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 6 | 20231225_074102__245 | 0 | 0.0 | 5.59621 | 0 | [368, 46] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_074102__245.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5837 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaRecapTask | 1SHOT | false | false | 6 | 20231225_074123__673 | 0 | 0.0 | 20.9842 | 0 | [368, 323] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231225_074123__673.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5838 | Apple-MacBook-Pro-M1 | wrap_string | orca2:13b | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_093056__211 | 1 | 0.0 | 29.997 | 2 | [368, 482] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/orca2:13b/evaluation__JuliaRecapTask__1SHOT__20231227_093056__211.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5839 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 6 | 20231225_081508__278 | 0 | 0.0 | 5.28398 | 0 | [57, 208] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_081508__278.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5840 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 6 | 20231225_081515__110 | 0 | 0.0 | 7.24517 | 0 | [57, 285] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231225_081515__110.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5841 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | InJulia | 1SHOT | false | false | 6 | 20231227_094559__602 | 0 | 0.0 | 7.972 | 0 | [57, 310] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__InJulia__1SHOT__20231227_094559__602.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5842 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231225_081432__144 | 0 | 0.0 | 19.3651 | 0 | [94, 729] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_081432__144.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5843 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231225_081502__390 | 0 | 0.0 | 29.8562 | 0 | [94, 1085] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231225_081502__390.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5844 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231227_094551__995 | 0 | 0.0 | 15.2566 | 0 | [94, 575] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertAsk__1SHOT__20231227_094551__995.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5845 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231225_081409__997 | 0 | 0.0 | 10.5271 | 0 | [173, 248] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_081409__997.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5846 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231225_081413__561 | 0 | 0.0 | 4.53324 | 0 | [173, 159] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231225_081413__561.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5847 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231227_094536__883 | 0 | 0.0 | 9.37842 | 0 | [173, 210] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaExpertCoTTask__1SHOT__20231227_094536__883.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5848 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231225_081634__229 | 0 | 0.0 | 11.1142 | 0 | [346, 376] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_081634__229.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5849 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231225_081635__233 | 0 | 0.0 | 1.11885 | 0 | [346, 1] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231225_081635__233.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5850 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaRecapCoTTask | 1SHOT | true | false | 6 | 20231227_094626__325 | 0 | 0.0 | 15.5198 | 0 | [346, 529] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapCoTTask__1SHOT__20231227_094626__325.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5851 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | true | false | 6 | 20231225_081559__358 | 0 | 0.0 | 10.8354 | 0 | [343, 366] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_081559__358.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5852 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 6 | 20231225_081623__955 | 0 | 0.0 | 23.6878 | 0 | [343, 811] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231225_081623__955.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5853 | Apple-MacBook-Pro-M1 | wrap_string | phi:2.7b-chat-v2-q6_K | JuliaRecapTask | 1SHOT | false | false | 6 | 20231227_094610__584 | 0 | 0.0 | 11.482 | 0 | [343, 387] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phi:2.7b-chat-v2-q6_K/evaluation__JuliaRecapTask__1SHOT__20231227_094610__584.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5854 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | InJulia | 1SHOT | true | false | 6 | 20231214_091429__429 | 0 | 0.0 | 13.757 | 0 | [65, 410] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231214_091429__429.json | 25.0 | missing | missing | missing | |
| 5855 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 6 | 20231225_074546__616 | 1 | 0.0 | 39.5885 | 2 | [76, 301] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_074546__616.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5856 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 6 | 20231225_074624__550 | 2 | 0.0 | 38.3352 | 2 | [76, 290] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231225_074624__550.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5857 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | InJulia | 1SHOT | true | true | 6 | 20231227_093338__884 | 3 | 0.0 | 36.5689 | 2 | [76, 282] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__InJulia__1SHOT__20231227_093338__884.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5858 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231214_091415__431 | 0 | 0.0 | 10.4333 | 0 | [94, 306] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231214_091415__431.json | 50.0 | missing | missing | missing | |
| 5859 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_074419__954 | 6 | 0.0 | 36.1501 | 2 | [115, 270] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_074419__954.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5860 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_074506__800 | 5 | 0.0 | 46.2149 | 2 | [115, 350] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231225_074506__800.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5861 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_093302__426 | 3 | 0.0 | 30.6901 | 2 | [115, 229] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaExpertAsk__1SHOT__20231227_093302__426.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5862 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231214_091404__148 | 0 | 0.0 | 13.2991 | 0 | [175, 359] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231214_091404__148.json | 50.0 | missing | missing | missing | |
| 5863 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_074254__420 | 0 | 0.0 | 49.1583 | 1 | [196, 171] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_074254__420.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5864 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_074343__397 | 6 | 0.0 | 48.4383 | 2 | [196, 341] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231225_074343__397.json | 100.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5865 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231227_093231__460 | 5 | 0.0 | 56.6044 | 2 | [196, 249] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaExpertCoTTask__1SHOT__20231227_093231__460.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5866 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231214_091524__520 | 0 | 0.0 | 18.3141 | 2 | [11, 500] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231214_091524__520.json | 75.0 | missing | missing | missing | |
| 5867 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_075003__536 | 3 | 0.0 | 39.0995 | 2 | [379, 241] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_075003__536.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5868 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_075049__767 | 5 | 0.0 | 45.401 | 2 | [379, 289] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231225_075049__767.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5869 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_093459__953 | 5 | 0.0 | 40.2278 | 2 | [379, 258] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaRecapCoTTask__1SHOT__20231227_093459__953.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5870 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | false | false | 6 | 20231214_091506__206 | 0 | 0.0 | 23.3595 | 0 | [365, 550] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231214_091506__206.json | 0.0 | missing | missing | missing | |
| 5871 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_074822__974 | 3 | 0.0 | 40.0888 | 2 | [376, 249] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_074822__974.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5872 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_074923__531 | 5 | 0.0 | 61.4046 | 2 | [376, 412] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231225_074923__531.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5873 | Apple-MacBook-Pro-M1 | wrap_string | phind-codellama:34b-v2 | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_093418__751 | 2 | 0.0 | 39.8572 | 2 | [376, 256] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/phind-codellama:34b-v2/evaluation__JuliaRecapTask__1SHOT__20231227_093418__751.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5874 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231225_080618__303 | 2 | 0.0 | 14.1745 | 2 | [69, 238] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_080618__303.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5875 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | false | 6 | 20231225_080640__478 | 0 | 0.0 | 22.2335 | 0 | [69, 378] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231225_080640__478.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5876 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | InJulia | 1SHOT | true | true | 6 | 20231227_094228__570 | 3 | 0.0 | 27.734 | 2 | [69, 471] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__InJulia__1SHOT__20231227_094228__570.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5877 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | false | false | 6 | 20231225_080545__819 | 0 | 0.0 | 19.6535 | 0 | [110, 327] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_080545__819.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5878 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_080603__866 | 0 | 0.0 | 17.8643 | 0 | [110, 296] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231225_080603__866.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5879 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231227_094200__132 | 0 | 0.0 | 19.7321 | 0 | [110, 328] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertAsk__1SHOT__20231227_094200__132.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5880 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_080511__583 | 0 | 0.0 | 31.3271 | 0 | [191, 315] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_080511__583.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5881 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231225_080525__143 | 0 | 0.0 | 13.1628 | 0 | [191, 200] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231225_080525__143.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5882 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231227_094140__714 | 0 | 0.0 | 27.9301 | 0 | [191, 306] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaExpertCoTTask__1SHOT__20231227_094140__714.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5883 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_080828__194 | 0 | 0.0 | 20.2183 | 2 | [377, 295] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_080828__194.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5884 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | false | 6 | 20231225_080845__173 | 0 | 0.0 | 17.0302 | 0 | [377, 241] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231225_080845__173.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5885 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_094310__362 | 0 | 0.0 | 22.5646 | 0 | [377, 333] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapCoTTask__1SHOT__20231227_094310__362.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5886 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | false | false | 6 | 20231225_080745__918 | 0 | 0.0 | 28.1364 | 0 | [375, 426] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_080745__918.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5887 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_080808__159 | 0 | 0.0 | 22.3829 | 0 | [375, 331] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231225_080808__159.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5888 | Apple-MacBook-Pro-M1 | wrap_string | solar:10.7b-instruct-v1-q4_K_M | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_094247__535 | 0 | 0.0 | 19.3206 | 1 | [375, 279] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/solar:10.7b-instruct-v1-q4_K_M/evaluation__JuliaRecapTask__1SHOT__20231227_094247__535.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5889 | Apple-MacBook-Pro-M1 | wrap_string | stablelm-zephyr | InJulia | 1SHOT | false | false | 6 | 20231214_091121__502 | 0 | 0.0 | 16.0864 | 0 | [65, 477] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/stablelm-zephyr/evaluation__InJulia__1SHOT__20231214_091121__502.json | 0.0 | missing | missing | missing | |
| 5890 | Apple-MacBook-Pro-M1 | wrap_string | stablelm-zephyr | InJulia | 1SHOT | true | false | 6 | 20231225_073733__653 | 0 | 0.0 | 5.00542 | 0 | [70, 282] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_073733__653.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5891 | Apple-MacBook-Pro-M1 | wrap_string | stablelm-zephyr | InJulia | 1SHOT | true | false | 6 | 20231225_073741__856 | 0 | 0.0 | 7.45959 | 0 | [70, 415] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/stablelm-zephyr/evaluation__InJulia__1SHOT__20231225_073741__856.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5892 | Apple-MacBook-Pro-M1 | wrap_string | stablelm-zephyr | InJulia | 1SHOT | true | false | 6 | 20231227_092915__423 | 0 | 0.0 | 9.95984 | 0 | [70, 554] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/stablelm-zephyr/evaluation__InJulia__1SHOT__20231227_092915__423.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5893 | Apple-MacBook-Pro-M1 | wrap_string | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231214_091105__300 | 0 | 0.0 | 12.3637 | 0 | [94, 361] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231214_091105__300.json | 25.0 | missing | missing | missing | |
| 5894 | Apple-MacBook-Pro-M1 | wrap_string | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231225_073723__375 | 0 | 0.0 | 7.03885 | 0 | [107, 386] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_073723__375.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5895 | Apple-MacBook-Pro-M1 | wrap_string | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_073728__709 | 1 | 0.0 | 5.11838 | 0 | [107, 281] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231225_073728__709.json | 54.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5896 | Apple-MacBook-Pro-M1 | wrap_string | stablelm-zephyr | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231227_092905__222 | 0 | 0.0 | 7.44119 | 0 | [107, 412] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/stablelm-zephyr/evaluation__JuliaExpertAsk__1SHOT__20231227_092905__222.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5897 | Apple-MacBook-Pro-M1 | wrap_string | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231214_091052__561 | 0 | 0.0 | 13.525 | 0 | [175, 366] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231214_091052__561.json | 50.0 | missing | missing | missing | |
| 5898 | Apple-MacBook-Pro-M1 | wrap_string | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20231225_073708__325 | 0 | 0.0 | 7.13576 | 0 | [184, 216] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_073708__325.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5899 | Apple-MacBook-Pro-M1 | wrap_string | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231225_073716__959 | 0 | 0.0 | 8.27553 | 0 | [184, 434] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231225_073716__959.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5900 | Apple-MacBook-Pro-M1 | wrap_string | stablelm-zephyr | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20231227_092857__746 | 0 | 0.0 | 6.95043 | 0 | [184, 225] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/stablelm-zephyr/evaluation__JuliaExpertCoTTask__1SHOT__20231227_092857__746.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5901 | Apple-MacBook-Pro-M1 | wrap_string | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | false | 6 | 20231214_091208__381 | 0 | 0.0 | 14.857 | 0 | [11, 409] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231214_091208__381.json | 25.0 | missing | missing | missing | |
| 5902 | Apple-MacBook-Pro-M1 | wrap_string | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231225_073817__638 | 0 | 0.0 | 6.37316 | 0 | [357, 285] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_073817__638.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5903 | Apple-MacBook-Pro-M1 | wrap_string | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231225_073822__531 | 0 | 0.0 | 4.51973 | 0 | [357, 190] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231225_073822__531.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5904 | Apple-MacBook-Pro-M1 | wrap_string | stablelm-zephyr | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_092927__312 | 1 | 0.0 | 4.36126 | 0 | [357, 183] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/stablelm-zephyr/evaluation__JuliaRecapCoTTask__1SHOT__20231227_092927__312.json | 54.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5905 | Apple-MacBook-Pro-M1 | wrap_string | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | true | 6 | 20231214_091153__603 | 0 | 0.0 | 20.0677 | 2 | [365, 466] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231214_091153__603.json | 75.0 | missing | missing | missing | |
| 5906 | Apple-MacBook-Pro-M1 | wrap_string | stablelm-zephyr | JuliaRecapTask | 1SHOT | true | false | 6 | 20231225_073804__811 | 0 | 0.0 | 9.03437 | 0 | [355, 421] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_073804__811.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5907 | Apple-MacBook-Pro-M1 | wrap_string | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 6 | 20231225_073811__370 | 0 | 0.0 | 6.92399 | 0 | [355, 316] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231225_073811__370.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5908 | Apple-MacBook-Pro-M1 | wrap_string | stablelm-zephyr | JuliaRecapTask | 1SHOT | false | false | 6 | 20231227_092922__456 | 0 | 0.0 | 7.35213 | 0 | [355, 339] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/stablelm-zephyr/evaluation__JuliaRecapTask__1SHOT__20231227_092922__456.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5909 | Apple-MacBook-Pro-M1 | wrap_string | starling-lm:latest | InJulia | 1SHOT | true | false | 6 | 20231214_090258__562 | 0 | 0.0 | 15.9127 | 0 | [65, 471] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/starling-lm:latest/evaluation__InJulia__1SHOT__20231214_090258__562.json | 25.0 | missing | missing | missing | |
| 5910 | Apple-MacBook-Pro-M1 | wrap_string | starling-lm:latest | InJulia | 1SHOT | true | true | 6 | 20231225_071754__157 | 1 | 0.0 | 17.2633 | 2 | [69, 542] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_071754__157.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5911 | Apple-MacBook-Pro-M1 | wrap_string | starling-lm:latest | InJulia | 1SHOT | true | true | 6 | 20231225_071806__528 | 0 | 0.0 | 11.1452 | 2 | [69, 349] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/starling-lm:latest/evaluation__InJulia__1SHOT__20231225_071806__528.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5912 | Apple-MacBook-Pro-M1 | wrap_string | starling-lm:latest | InJulia | 1SHOT | true | true | 6 | 20231227_092224__211 | 1 | 0.0 | 11.9393 | 2 | [69, 384] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/starling-lm:latest/evaluation__InJulia__1SHOT__20231227_092224__211.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5913 | Apple-MacBook-Pro-M1 | wrap_string | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231214_090242__299 | 0 | 0.0 | 10.9448 | 0 | [94, 321] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231214_090242__299.json | 50.0 | missing | missing | missing | |
| 5914 | Apple-MacBook-Pro-M1 | wrap_string | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_071730__267 | 4 | 0.0 | 10.2044 | 2 | [110, 313] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_071730__267.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5915 | Apple-MacBook-Pro-M1 | wrap_string | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_071737__519 | 4 | 0.0 | 7.12749 | 2 | [110, 214] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231225_071737__519.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5916 | Apple-MacBook-Pro-M1 | wrap_string | starling-lm:latest | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_092212__323 | 2 | 0.0 | 10.5519 | 2 | [110, 332] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/starling-lm:latest/evaluation__JuliaExpertAsk__1SHOT__20231227_092212__323.json | 83.3333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5917 | Apple-MacBook-Pro-M1 | wrap_string | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20231214_090231__677 | 0 | 0.0 | 11.0101 | 0 | [175, 294] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231214_090231__677.json | 25.0 | missing | missing | missing | |
| 5918 | Apple-MacBook-Pro-M1 | wrap_string | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_071712__662 | 0 | 0.0 | 14.6829 | 0 | [191, 262] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_071712__662.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5919 | Apple-MacBook-Pro-M1 | wrap_string | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_071719__573 | 0 | 0.0 | 7.10844 | 0 | [191, 202] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231225_071719__573.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5920 | Apple-MacBook-Pro-M1 | wrap_string | starling-lm:latest | JuliaExpertCoTTask | 1SHOT | false | false | 6 | 20231227_092202__743 | 0 | 0.0 | 13.3321 | 0 | [191, 243] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/starling-lm:latest/evaluation__JuliaExpertCoTTask__1SHOT__20231227_092202__743.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5921 | Apple-MacBook-Pro-M1 | wrap_string | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231214_090358__292 | 0 | 0.0 | 22.5214 | 0 | [11, 606] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231214_090358__292.json | 0.0 | missing | missing | missing | |
| 5922 | Apple-MacBook-Pro-M1 | wrap_string | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_071903__103 | 0 | 0.0 | 13.46 | 0 | [377, 366] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_071903__103.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5923 | Apple-MacBook-Pro-M1 | wrap_string | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_071919__305 | 0 | 0.0 | 15.6301 | 0 | [377, 431] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231225_071919__305.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5924 | Apple-MacBook-Pro-M1 | wrap_string | starling-lm:latest | JuliaRecapCoTTask | 1SHOT | true | false | 6 | 20231227_092250__349 | 0 | 0.0 | 10.6243 | 0 | [377, 287] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/starling-lm:latest/evaluation__JuliaRecapCoTTask__1SHOT__20231227_092250__349.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5925 | Apple-MacBook-Pro-M1 | wrap_string | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 6 | 20231214_090336__713 | 0 | 0.0 | 24.5493 | 0 | [365, 580] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231214_090336__713.json | 50.0 | missing | missing | missing | |
| 5926 | Apple-MacBook-Pro-M1 | wrap_string | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_071839__494 | 0 | 0.0 | 11.8157 | 2 | [375, 316] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_071839__494.json | 75.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5927 | Apple-MacBook-Pro-M1 | wrap_string | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_071849__202 | 0 | 0.0 | 9.8079 | 0 | [375, 255] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231225_071849__202.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5928 | Apple-MacBook-Pro-M1 | wrap_string | starling-lm:latest | JuliaRecapTask | 1SHOT | true | true | 6 | 20231227_092240__441 | 3 | 0.0 | 15.0611 | 2 | [375, 426] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/starling-lm:latest/evaluation__JuliaRecapTask__1SHOT__20231227_092240__441.json | 87.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5929 | Apple-MacBook-Pro-M1 | wrap_string | yi:34b-chat | InJulia | 1SHOT | true | false | 6 | 20231214_090449__830 | 0 | 0.0 | 13.2438 | 0 | [65, 395] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/yi:34b-chat/evaluation__InJulia__1SHOT__20231214_090449__830.json | 25.0 | missing | missing | missing | |
| 5930 | Apple-MacBook-Pro-M1 | wrap_string | yi:34b-chat | InJulia | 1SHOT | true | true | 6 | 20231225_072251__372 | 0 | 0.0 | 60.5904 | 1 | [65, 447] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_072251__372.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5931 | Apple-MacBook-Pro-M1 | wrap_string | yi:34b-chat | InJulia | 1SHOT | true | true | 6 | 20231225_072343__328 | 0 | 0.0 | 51.4461 | 1 | [65, 379] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/yi:34b-chat/evaluation__InJulia__1SHOT__20231225_072343__328.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5932 | Apple-MacBook-Pro-M1 | wrap_string | yi:34b-chat | InJulia | 1SHOT | true | true | 6 | 20231227_092511__711 | 0 | 0.0 | 59.478 | 1 | [65, 449] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/yi:34b-chat/evaluation__InJulia__1SHOT__20231227_092511__711.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5933 | Apple-MacBook-Pro-M1 | wrap_string | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | false | 6 | 20231214_090436__213 | 0 | 0.0 | 9.96774 | 0 | [94, 291] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231214_090436__213.json | 25.0 | missing | missing | missing | |
| 5934 | Apple-MacBook-Pro-M1 | wrap_string | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_072121__116 | 4 | 0.0 | 36.9075 | 2 | [104, 263] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_072121__116.json | 91.6667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5935 | Apple-MacBook-Pro-M1 | wrap_string | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231225_072151__268 | 0 | 0.0 | 29.434 | 1 | [104, 206] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231225_072151__268.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5936 | Apple-MacBook-Pro-M1 | wrap_string | yi:34b-chat | JuliaExpertAsk | 1SHOT | true | true | 6 | 20231227_092411__512 | 1 | 0.0 | 29.0728 | 2 | [104, 208] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/yi:34b-chat/evaluation__JuliaExpertAsk__1SHOT__20231227_092411__512.json | 79.1667 | missing | {\n "num_gpu": 99\n} | missing | |
| 5937 | Apple-MacBook-Pro-M1 | wrap_string | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231214_090426__402 | 1 | 0.0 | 27.5517 | 0 | [175, 739] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231214_090426__402.json | 54.1667 | missing | missing | missing | |
| 5938 | Apple-MacBook-Pro-M1 | wrap_string | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_072015__302 | 0 | 0.0 | 55.6793 | 1 | [184, 203] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_072015__302.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5939 | Apple-MacBook-Pro-M1 | wrap_string | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | true | 6 | 20231225_072044__105 | 5 | 0.0 | 28.3155 | 2 | [184, 186] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231225_072044__105.json | 95.8333 | missing | {\n "num_gpu": 99\n} | missing | |
| 5940 | Apple-MacBook-Pro-M1 | wrap_string | yi:34b-chat | JuliaExpertCoTTask | 1SHOT | true | false | 6 | 20231227_092342__607 | 0 | 0.0 | 51.8043 | 0 | [184, 200] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/yi:34b-chat/evaluation__JuliaExpertCoTTask__1SHOT__20231227_092342__607.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5941 | Apple-MacBook-Pro-M1 | wrap_string | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231214_090547__681 | 0 | 0.0 | 20.2949 | 2 | [11, 549] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231214_090547__681.json | 75.0 | missing | missing | missing | |
| 5942 | Apple-MacBook-Pro-M1 | wrap_string | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231225_073013__766 | 0 | 0.0 | 81.5425 | 1 | [378, 528] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_073013__766.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5943 | Apple-MacBook-Pro-M1 | wrap_string | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | false | false | 6 | 20231225_073023__255 | 0 | 0.0 | 9.98316 | 0 | [378, 15] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231225_073023__255.json | 0.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5944 | Apple-MacBook-Pro-M1 | wrap_string | yi:34b-chat | JuliaRecapCoTTask | 1SHOT | true | true | 6 | 20231227_092640__978 | 0 | 0.0 | 43.1922 | 0 | [378, 269] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/yi:34b-chat/evaluation__JuliaRecapCoTTask__1SHOT__20231227_092640__978.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5945 | Apple-MacBook-Pro-M1 | wrap_string | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 6 | 20231214_090527__766 | 0 | 0.0 | 25.2213 | 2 | [365, 596] | 0.4.0 | 2 | 1.1 | {} | PromptingTools.OllamaManagedSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231214_090527__766.json | 75.0 | missing | missing | missing | |
| 5946 | Apple-MacBook-Pro-M1 | wrap_string | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_072708__873 | 0 | 0.0 | 57.2056 | 0 | [376, 363] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_072708__873.json | 50.0 | missing | {\n "num_gpu": 99\n} | missing | |
| 5947 | Apple-MacBook-Pro-M1 | wrap_string | yi:34b-chat | JuliaRecapTask | 1SHOT | true | true | 6 | 20231225_072851__778 | 0 | 0.0 | 103.232 | 1 | [376, 667] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231225_072851__778.json | 62.5 | missing | {\n "num_gpu": 99\n} | missing | |
| 5948 | Apple-MacBook-Pro-M1 | wrap_string | yi:34b-chat | JuliaRecapTask | 1SHOT | true | false | 6 | 20231227_092557__171 | 0 | 0.0 | 45.4291 | 0 | [376, 286] | 0.6.0 | 2 | 1.1 | {\n "options": {\n "num_gpu": 99\n }\n} | PromptingTools.OllamaSchema() | /home/runner/work/Julia-LLM-Leaderboard/Julia-LLM-Leaderboard/code_generation/utility_functions/wrap_string/yi:34b-chat/evaluation__JuliaRecapTask__1SHOT__20231227_092557__171.json | 25.0 | missing | {\n "num_gpu": 99\n} | missing |
Model Comparison
Highest average score by model:
fig = @chain df begin
@by [:model] begin
:cost = mean(:cost)
:elapsed = mean(:elapsed_seconds)
:score = mean(:score)
end
transform(_, names(_, Number) .=> ByRow(x -> round(x, digits = 1)), renamecols = false)
@orderby -:score
@aside local order_ = _.model
data(_) *
mapping(:model => sorter(order_) => "Model",
:score => "Avg. Score (Max 100 pts)") * visual(BarPlot; bar_labels = :y, label_offset = 0)
draw(;
figure = (; size = (900, 600)),
axis = (;
limits = (nothing, nothing, 0, 100),
xticklabelrotation = 45,
title = "Open-Source LLM Model Performance [PRELIMINARY]"))
end
fig
Table:
output = @chain df begin
@by [:model] begin
:elapsed = mean(:elapsed_seconds)
:elapsed_median = median(:elapsed_seconds)
:score = mean(:score)
:score_median = median(:score)
:count_zero_score = count(iszero, :score)
:count_full_score = count(==(100), :score) # scores are on a 0-100 scale, so a full score is 100
end
transform(_, names(_, Number) .=> ByRow(x -> round(x, digits = 1)), renamecols = false)
@orderby -:score
end
# markdown_table(output, String) |> clipboard
markdown_table(output)
| model | elapsed | elapsed_median | score | score_median | count_zero_score | count_full_score |
|---|---|---|---|---|---|---|
| magicoder:7b-s-cl-q6_K | 15.6 | 15.6 | 61.9 | 60.0 | 11.0 | 0.0 |
| phind-codellama:34b-v2 | 37.1 | 37.1 | 61.2 | 72.5 | 28.0 | 0.0 |
| magicoder | 12.8 | 12.8 | 53.7 | 50.0 | 41.0 | 0.0 |
| deepseek-coder:33b-instruct-q4_K_M | 46.7 | 46.7 | 53.4 | 50.0 | 48.0 | 0.0 |
| nous-hermes2:34b-yi-q4_K_M | 56.8 | 56.8 | 50.8 | 50.0 | 65.0 | 0.0 |
| codellama:13b-instruct | 18.1 | 18.1 | 50.5 | 50.0 | 52.0 | 0.0 |
| openhermes2.5-mistral | 12.9 | 12.9 | 50.4 | 50.0 | 35.0 | 0.0 |
| starling-lm:latest | 13.7 | 13.7 | 50.0 | 50.0 | 27.0 | 0.0 |
| openchat:7b-v3.5-1210-q4_K_M | 14.4 | 14.4 | 47.7 | 50.0 | 35.0 | 0.0 |
| yi:34b-chat | 43.9 | 43.9 | 46.3 | 50.0 | 35.0 | 0.0 |
| mistral:7b-instruct-v0.2-q4_0 | 12.4 | 12.4 | 45.8 | 50.0 | 52.0 | 0.0 |
| mistral:7b-instruct-v0.2-q6_K | 21.7 | 21.7 | 45.7 | 50.0 | 31.0 | 0.0 |
| mistral:7b-instruct-v0.2-q4_K_M | 15.6 | 15.6 | 44.6 | 50.0 | 47.0 | 0.0 |
| solar:10.7b-instruct-v1-q4_K_M | 18.8 | 18.8 | 32.7 | 25.0 | 79.0 | 0.0 |
| mistral:7b-instruct-q4_K_M | 13.9 | 13.9 | 32.3 | 25.0 | 60.0 | 0.0 |
| llama2 | 17.1 | 17.1 | 28.3 | 25.0 | 93.0 | 0.0 |
| orca2:13b | 20.1 | 20.1 | 23.8 | 0.0 | 145.0 | 0.0 |
| stablelm-zephyr | 9.9 | 9.9 | 21.3 | 25.0 | 125.0 | 0.0 |
| dolphin-phi:2.7b-v2.6-q6_K | 8.9 | 8.9 | 18.1 | 0.0 | 154.0 | 0.0 |
| codellama:13b-python | 12.5 | 12.5 | 13.8 | 0.0 | 140.0 | 0.0 |
| phi:2.7b-chat-v2-q6_K | 13.0 | 13.0 | 8.0 | 0.0 | 197.0 | 0.0 |
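The per-model summary above can be reproduced on toy data. The following is a minimal sketch using invented scores (not the leaderboard results), assuming only DataFramesMeta and Statistics:

```julia
using DataFrames, DataFramesMeta
using Statistics: mean, median

# Invented scores on the 0-100 leaderboard scale (100 = perfect, 0 = unparseable)
toy = DataFrame(model = ["a", "a", "a", "b", "b", "b"],
    score = [100.0, 50.0, 0.0, 25.0, 25.0, 100.0])

summary_df = @by toy :model begin
    :score = mean(:score)
    :score_median = median(:score)
    :count_zero_score = count(iszero, :score)
    :count_full_score = count(==(100), :score)  # full score is 100 on this scale
end
```

Sorting the result by `:score` descending (`@orderby -:score`) then yields a ranking of the same shape as the table above.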
Overview by Prompt Template
Bar chart with all OSS models and various prompt templates
fig = @chain df begin
@by [:model, :prompt_label] begin
:cost = mean(:cost)
:elapsed = mean(:elapsed_seconds)
:score = mean(:score)
:score_median = median(:score)
:cnt = $nrow
end
@aside local average_ = @by(_, :model, :avg=mean(:score)) |>
x -> @orderby(x, -:avg).model
data(_) *
mapping(:model => sorter(average_) => "Model",
:score => "Avg. Score (Max 100 pts)",
color = :prompt_label => "Prompts",
dodge = :prompt_label) * visual(BarPlot)
draw(; figure = (size = (900, 600),),
axis = (xticklabelrotation = 45, title = "Comparison for OSS Models [PRELIMINARY]"),
legend = (; position = :bottom))
end
SAVE_PLOTS && save("assets/model-prompt-comparison-oss.png", fig)
fig
Table:
output = @chain df begin
@by [:model, :prompt_label] begin
:cost = mean(:cost)
:elapsed = mean(:elapsed_seconds)
:score = mean(:score)
end
@aside average_ = @by _ :model :AverageScore=mean(:score) |> x -> round(x, digits = 1)
unstack(:model, :prompt_label, :score; fill = 0.0)
transform(_, names(_, Number) .=> ByRow(x -> round(x, digits = 1)), renamecols = false)
leftjoin(average_, on = :model)
@orderby -:AverageScore
end
# markdown_table(output, String) |> clipboard
markdown_table(output)
| model | InJulia | JuliaExpertAsk | JuliaExpertCoTTask | JuliaRecapCoTTask | JuliaRecapTask | AverageScore |
|---|---|---|---|---|---|---|
| magicoder:7b-s-cl-q6_K | 63.3 | 61.8 | 54.5 | 62.6 | 67.4 | 61.9 |
| phind-codellama:34b-v2 | 60.6 | 69.7 | 59.1 | 59.1 | 57.8 | 61.2 |
| magicoder | 59.6 | 51.0 | 46.4 | 57.0 | 54.5 | 53.7 |
| deepseek-coder:33b-instruct-q4_K_M | 58.9 | 39.8 | 48.7 | 61.6 | 58.1 | 53.4 |
| nous-hermes2:34b-yi-q4_K_M | 58.8 | 37.7 | 52.8 | 46.2 | 58.8 | 50.8 |
| codellama:13b-instruct | 55.5 | 51.8 | 45.6 | 47.9 | 51.7 | 50.5 |
| openhermes2.5-mistral | 50.3 | 52.3 | 54.6 | 42.2 | 52.6 | 50.4 |
| starling-lm:latest | 53.6 | 57.9 | 38.2 | 46.9 | 53.5 | 50.0 |
| openchat:7b-v3.5-1210-q4_K_M | 49.8 | 47.8 | 44.3 | 47.6 | 49.1 | 47.7 |
| yi:34b-chat | 46.1 | 52.4 | 41.1 | 42.9 | 49.1 | 46.3 |
| mistral:7b-instruct-v0.2-q4_0 | 50.3 | 41.4 | 45.6 | 45.6 | 45.9 | 45.8 |
| mistral:7b-instruct-v0.2-q6_K | 42.6 | 39.8 | 48.1 | 48.8 | 49.3 | 45.7 |
| mistral:7b-instruct-v0.2-q4_K_M | 41.0 | 49.3 | 41.4 | 43.8 | 47.5 | 44.6 |
| solar:10.7b-instruct-v1-q4_K_M | 42.0 | 36.0 | 17.6 | 32.5 | 35.8 | 32.8 |
| mistral:7b-instruct-q4_K_M | 34.7 | 31.6 | 31.3 | 31.6 | 32.3 | 32.3 |
| llama2 | 26.1 | 33.0 | 29.7 | 28.4 | 24.1 | 28.3 |
| orca2:13b | 28.1 | 16.9 | 27.3 | 23.7 | 23.2 | 23.8 |
| stablelm-zephyr | 22.0 | 17.8 | 18.8 | 22.2 | 25.8 | 21.3 |
| dolphin-phi:2.7b-v2.6-q6_K | 22.3 | 18.3 | 14.9 | 16.3 | 18.6 | 18.1 |
| codellama:13b-python | 11.4 | 14.4 | 15.1 | 13.0 | 14.9 | 13.8 |
| phi:2.7b-chat-v2-q6_K | 7.3 | 6.5 | 6.8 | 9.2 | 10.2 | 8.0 |
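The wide table above comes from pivoting the long per-(model, prompt) scores with `unstack`; a small, self-contained sketch on made-up numbers (not the actual results):

```julia
using DataFrames

# Hypothetical long-format scores: one row per model/prompt pair
long = DataFrame(
    model = ["a", "a", "b", "b"],
    prompt_label = ["InJulia", "JuliaExpertAsk", "InJulia", "JuliaExpertAsk"],
    score = [60.0, 55.0, 40.0, 45.0])

# One row per model, one column per prompt template; missing pairs become 0.0
wide = unstack(long, :model, :prompt_label, :score; fill = 0.0)
```

The `fill = 0.0` keyword matters here: a model that never produced a parseable answer for some prompt template is scored 0, not dropped.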
Other Considerations
Comparison of Time-to-generate vs Average Score
fig = @chain df begin
@aside local xlims = quantile(df.elapsed_seconds, [0.01, 0.99])
@by [:model, :prompt_label] begin
:elapsed = mean(:elapsed_seconds)
:elapsed_median = median(:elapsed_seconds)
:score = mean(:score)
:score_median = median(:score)
:cnt = $nrow
end
data(_) * mapping(:elapsed => "Avg. Elapsed Time (s)",
:score => "Avg. Score (Max 100 pts)",
color = :model => "Model")
draw(; figure = (size = (800, 800),),
axis = (xticklabelrotation = 45,
title = "Elapsed Time vs Score for OSS Models [PRELIMINARY]",
limits = (xlims..., nothing, nothing)),
palettes = (; color = Makie.ColorSchemes.tab20.colors))
end
SAVE_PLOTS && save("assets/elapsed-vs-score-scatter-oss.png", fig)
fig
This page was generated using Literate.jl.